Scalable Tensor Factorizations with Incomplete Data


1 Scalable Tensor Factorizations with Incomplete Data Tamara G. Kolda & Daniel M. Dunlavy, Sandia National Labs; Evrim Acar, Information Technologies Institute, TUBITAK-UEKAE, Turkey; Morten Mørup, Technical University of Denmark. U.S. Department of Energy, Office of Advanced Scientific Computing Research. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

2 A Nice Way to Start the Day To: Kolda, Tamara G <tgkolda@sandia.gov> From: Engineer, Joe S <jsengin@sandia.gov> Subject: Help with a math problem July 15, 2010 T. G. Kolda - SIAM Annual 2

3 Matrix Factorization To: Engineer, Joe S <jsengin@sandia.gov> From: Kolda, Tamara G <tgkolda@sandia.gov> Subject: Re: Help with a math problem Dear Joe, No problem, thanks to the handy Singular Value Decomposition (SVD)! Tammy p.s. Would you prefer Nonnegative Matrix Factorization (NMF), Independent Component Analysis (ICA), etc.?

4 A Nicer Way to Start the Day To: Kolda, Tamara G <tgkolda@sandia.gov> From: Engineer, Joe S <jsengin@sandia.gov> Subject: Help with a math problem

5 Tensor Factorization To: Engineer, Joe S <jsengin@sandia.gov> From: Kolda, Tamara G <tgkolda@sandia.gov> Subject: Re: Help with a math problem Dear Joe, I factored your tensor using the CANDECOMP/PARAFAC (CP) tensor decomposition. Though determining tensor rank is NP-hard, I used heuristic methods to make a reasonable determination of the number of components. Tammy

6 A Better Way to Start the Day To: Kolda, Tamara G <tgkolda@sandia.gov> From: Engineer, Joe S <jsengin@sandia.gov> Subject: Help with a math problem Dear Tammy, Our measurement device had some snafus, and so we couldn't collect all the data. But we still need to factorize this (incomplete) matrix. Maybe we can just fill in the missing entries with zeros? -Joe

7 Matrix Factorizations with Missing Data To: Engineer, Joe S <jsengin@sandia.gov> From: Kolda, Tamara G <tgkolda@sandia.gov> Subject: Re: Help with a math problem Dear Joe, Per Ruhe (1974): "When the number of missing observations is small compared to the sample, it is common practice to replace the missing observations by means or extreme values, but when larger parts of the data matrix are empty this is no longer a sensible approach." However, many modern methods exist for factorizing matrices with missing data. A favorite reference is Buchanan & Fitzgibbon (CVPR '05). See also the work on matrix completion by Candès, Recht, and others. Most problems can be solved so long as there are enough known entries and they are distributed reasonably. Tammy

8 The Very Best Way to Start the Day To: Kolda, Tamara G <tgkolda@sandia.gov> From: Engineer, Joe S <jsengin@sandia.gov> Subject: Help with a math problem Dear Tammy, Our measurement device had some snafus, and so we couldn't collect all the data. But we still need to factorize this (incomplete) tensor. -Joe

9 Factor Example: Latent Semantic Indexing Berry, Dumais, O'Brien (1995). [Figure: a term-document matrix (terms such as car, service, error, military, repair; documents d1-d3) and its 2-factor approximation, illustrating gauge freedom.]

10 Factor Example: Epilepsy Data Measurements are recorded at multiple sites (channels) over time. The data is transformed via a continuous wavelet transform (CWT), yielding a channel x frequency x time tensor. Acar, Bingol, Bingol, Bro and Yener, Bioinformatics. [Figure: CP factors in the time, frequency, and channel modes separate an eye artifact from a seizure.]

11 A Whole Host of Tensor Decompositions CANDECOMP/PARAFAC (CP), Tucker, PARATUCK2 (DEDICOM), Block CP, PARAFAC2. Kolda & Bader, Tensor Decompositions and Applications, SIAM Review, 2009

12 Tensor Factorizations have Numerous Applications Modeling fluorescence excitation-emission data; signal processing; brain imaging (e.g., fMRI) data; web graph plus anchor term analysis; image compression and classification; texture analysis; text analysis (e.g., multi-way LSI); approximating Newton potentials, stochastic PDEs, etc. References: Sidiropoulos, Giannakis, and Bro, IEEE Trans. Signal Processing; Sun, Tao, and Faloutsos, KDD '06; ERPWAVELAB by Morten Mørup; Furukawa, Kawasaki, Ikeuchi, and Sakauchi, EGRW '02; Doostan, Iaccarino, and Etemadi, J. Computational Physics, 2009; Hazan, Polak, and Shashua, ICCV; Andersen and Bro, J. Chemometrics.

13 Matrix & Tensor Factor Analysis Singular Value Decomposition (SVD) or Principal Component Analysis (PCA) expresses a matrix as the sum of component rank-1 matrices. The CANDECOMP/PARAFAC Decomposition (CP) expresses a tensor as the sum of component rank-1 tensors. (Hitchcock '27, Harshman '70, Carroll & Chang '70)
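As a concrete illustration of the CP model described above (not from the talk), here is a minimal numpy sketch that assembles [[A, B, C]] as a sum of R rank-one outer products; the function name cp_to_tensor is our own.

```python
import numpy as np

def cp_to_tensor(A, B, C):
    """Assemble the CP model [[A, B, C]] for factor matrices
    A (I x R), B (J x R), C (K x R): the sum over r of the
    rank-one outer products a_r o b_r o c_r."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# A single rank-one component: X[i, j, k] = a[i] * b[j] * c[k].
a = np.array([1.0, 2.0])
b = np.array([3.0, 4.0, 5.0])
c = np.array([6.0, 7.0])
X = cp_to_tensor(a[:, None], b[:, None], c[:, None])
```

With R > 1 columns the same call sums the R rank-one pieces, which is exactly the CP picture on this slide.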

14 Tensors have an Advantage for Recovering Factors Matrices: scale & sign ambiguity; permutation ambiguity; gauge freedom, since A B^T = (A S)(B S^-T)^T for any invertible matrix S. Tensors: scale and sign ambiguity; permutation ambiguity; but otherwise components are unique under mild conditions!

15 Kruskal's Uniqueness Condition The k-rank of a matrix A, denoted k_A, is the maximum value of k such that any k columns are linearly independent. An R-component factorization [[A,B,C]] is essentially unique* if k_A + k_B + k_C >= 2R + 2. An R-component factorization [[A(1), ..., A(N)]] is essentially unique* if k_A(1) + ... + k_A(N) >= 2R + (N - 1). Kruskal (1977, 1989), Sidiropoulos & Bro (2000). * Essentially unique means up to permutation and scaling: any other solution satisfies A = A' Pi Lambda_A, B = B' Pi Lambda_B, C = C' Pi Lambda_C, where Pi is an RxR permutation matrix, so long as Lambda_A Lambda_B Lambda_C = I.
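The k-rank is easy to state but combinatorial to check. As a sketch (our own helper, not from the talk), a brute-force numpy implementation for small matrices:

```python
import numpy as np
from itertools import combinations

def k_rank(A, tol=1e-10):
    """Return the k-rank of A: the largest k such that EVERY set of
    k columns is linearly independent. Checks all column subsets,
    so it is exponential in the number of columns; fine for small R."""
    R = A.shape[1]
    for k in range(R, 0, -1):
        if all(np.linalg.matrix_rank(A[:, list(cols)], tol=tol) == k
               for cols in combinations(range(R), k)):
            return k
    return 0

# Columns 0 and 2 are parallel, so one pair of columns is dependent:
# the k-rank is 1 even though the matrix rank is 2.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 0.0]])
```

The example shows why k-rank can be strictly smaller than rank: a single dependent pair of columns drops it all the way to 1.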

16 Motivation: EEG Data Biomedical signal processing: EEG (electroencephalogram) signals can be recorded using electrodes placed on the scalp. The missing data problem occurs when electrodes get loose or disconnected, causing the signal to be unusable, or when different experiments have overlapping but not identical channels. Can we still compute the channel x time-frequency x experiments factorization if data are missing?

17 The Missing Data Problem Standard Problem: Given tensor X, find A, B, and C such that X ≈ [[A,B,C]]. Typically formulated as a least squares problem. Missing Data Problem: Given a subset of the entries of X, find A, B, and C such that x_ijk ≈ [[A,B,C]]_ijk for the known entries.

18 Mathematical Formulation Define the weight tensor W such that w_ijk = 1 if x_ijk is known and w_ijk = 0 if x_ijk is missing. Then the least squares problem is min over A, B, C of || W .* (X - [[A,B,C]]) ||^2, where .* denotes the elementwise product (as in MATLAB). With a solution in hand, the tensor can be completed via X_completed = W .* X + (1 - W) .* [[A,B,C]].
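A minimal numpy sketch of this objective and the completion step (function names are our own; W is the binary mask defined above):

```python
import numpy as np

def weighted_cp_loss(X, W, A, B, C):
    """|| W .* (X - [[A,B,C]]) ||^2: only entries with w_ijk = 1
    contribute, so missing entries of X may hold any placeholder."""
    M = np.einsum('ir,jr,kr->ijk', A, B, C)   # the CP model [[A,B,C]]
    return float(np.sum((W * (X - M)) ** 2))

def complete(X, W, A, B, C):
    """Keep the known entries of X; fill the rest from the model."""
    M = np.einsum('ir,jr,kr->ijk', A, B, C)
    return W * X + (1 - W) * M

# If X is exactly a CP tensor, the loss at the true factors is zero
# regardless of which entries are masked out.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 2))
B = rng.standard_normal((3, 2))
C = rng.standard_normal((5, 2))
X = np.einsum('ir,jr,kr->ijk', A, B, C)
W = (rng.random(X.shape) > 0.5).astype(float)
```

Note that filling missing entries with zeros amounts to dropping W from the loss, which is exactly the approach the next slides show failing.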

19 Brain dynamics can be captured even with extensive missing channels. [Figure: 64 channels x 4392 time-frequency factorization; recovered factors vs. number of missing channels. Replacing missing entries with zero degrades the factors, while the more sensible (weighted) approach preserves them.]


21 Brain dynamics can be captured even with extensive missing channels. [Figure: 64 channels x 4392 time-frequency factorization; channel, time-frequency, and experiment factors with no missing data vs. with 30 channels per experiment missing.]

22 Chemometrics Example Bro (1997): Fluorescence measurements of 5 samples containing 3 amino acids: tyrosine, tryptophan, phenylalanine. Each amino acid corresponds to a rank-one component. Tensor of size 5 x 51 x 201: 5 samples, 51 excitations, 201 emissions. [Figure: factors in the emission mode.]

23 No Missing Data [Movie]

24 No Missing Data, Iteration 7

25 Replacing 50% Missing Values with Mean [Movie]

26 Replacing 50% Missing Values with Mean, Iteration 7

27 50% Missing Data using Sensible Approach [Movie]

28 50% Missing Data using Sensible Approach, Iteration 7

29 75% Missing Data using Sensible Approach [Movie]

30 75% Missing Data using Sensible Approach, Iteration 7

31 Sensible Approach: Direct Optimization Goal is to minimize the following nonlinear objective function, which fits only the known values: f(A,B,C) = || W .* (X - [[A,B,C]]) ||^2. 2nd-order optimization: Gauss-Newton as implemented in INDAFAC; the size of the Jacobian is (1-M)IJK [# equations] by R(I+J+K) [# variables], where M is the fraction of missing data (Tomasi & Bro). 1st-order optimization: nonlinear CG as implemented in CP-WOPT (available in the Tensor Toolbox); Acar, Dunlavy, Kolda, Mørup, 2010. Alternative to direct optimization: EM-ALS. Repeat: impute missing values from the model, then update the model. Implementation in parafac from the N-way Toolbox (Bro et al.).

32 EM-ALS Algorithm Implementation: parafac in the N-way Toolbox by Bro et al. 1. Initialize matrices A, B, C. 2. Complete X using the current model: X_hat = W .* X + (1 - W) .* [[A,B,C]]. 3. Compute a new A, B, C that best fit the completed X_hat via one or more iterations of alternating least squares (ALS). 4. Go back to step 2 (until convergence).
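The four steps above can be sketched compactly in numpy (helper names are our own, and this is a bare-bones sketch rather than the N-way Toolbox implementation; unfoldings use numpy's row-major column ordering, so mode 0 pairs with the Khatri-Rao product of B and C):

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding with numpy's row-major column ordering."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(P, Q):
    """Columnwise Kronecker product: column r is kron(P[:, r], Q[:, r])."""
    return (P[:, None, :] * Q[None, :, :]).reshape(-1, P.shape[1])

def em_als(X, W, R, iters=300, seed=0):
    """EM-ALS for a 3-way tensor: impute missing entries (W == 0) from
    the current model, then take one sweep of alternating least squares
    on the completed tensor."""
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((n, R)) for n in X.shape)
    for _ in range(iters):
        M = np.einsum('ir,jr,kr->ijk', A, B, C)
        Xc = W * X + (1 - W) * M                 # step 2: complete X
        # step 3: least-squares update of each factor in turn
        A = np.linalg.lstsq(khatri_rao(B, C), unfold(Xc, 0).T, rcond=None)[0].T
        B = np.linalg.lstsq(khatri_rao(A, C), unfold(Xc, 1).T, rcond=None)[0].T
        C = np.linalg.lstsq(khatri_rao(A, B), unfold(Xc, 2).T, rcond=None)[0].T
    return A, B, C
```

With W all ones this reduces to plain ALS, and the E-step is a no-op.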

33 Exploiting Sparsity in CP-WOPT Let Y = W .* X and Z = W .* [[A,B,C]]. Observe that Y and Z are sparse if W is sparse: the calculation involves two tensors with the same set of structural nonzeros. The gradient can be calculated as, e.g., dF/dA = -2 (Y - Z)_(1) (C ⊙ B), i.e., the mode-1 unfolded residual tensor times the Khatri-Rao product of the other factor matrices. More missing data yields a sparser problem!
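A dense numpy sketch of this gradient (our own helpers, kept dense for clarity even though the point of CP-WOPT is that Y and Z can be stored sparsely; the mode-n unfolding here uses numpy's row-major column ordering, so mode 0 pairs with the Khatri-Rao product of B and C rather than C and B):

```python
import numpy as np

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(P, Q):
    return (P[:, None, :] * Q[None, :, :]).reshape(-1, P.shape[1])

def cp_wopt_grad(X, W, A, B, C):
    """Gradient of f = ||W .* (X - [[A,B,C]])||^2 w.r.t. A, B, C.
    Y = W .* X and Z = W .* [[A,B,C]] share the sparsity pattern of W,
    which is what a sparse implementation exploits."""
    Y = W * X
    Z = W * np.einsum('ir,jr,kr->ijk', A, B, C)
    T = Y - Z
    Ga = -2 * unfold(T, 0) @ khatri_rao(B, C)
    Gb = -2 * unfold(T, 1) @ khatri_rao(A, C)
    Gc = -2 * unfold(T, 2) @ khatri_rao(A, B)
    return Ga, Gb, Gc
```

At an exact fit the residual T is zero, so all three gradients vanish, as a first-order optimizer requires.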

34 Numerical Experiment Set-up for Simulated Data Step 1: Generate factor matrices with random entries drawn from N(0,1). Each matrix has R=5 columns; normalize each column to length 1. Step 2: Generate tensors and add 10% noise. Step 3: Randomly remove entries or entire fibers. Step 4: Factorize the incomplete tensor with CP-WOPT (dense or sparse), INDAFAC (Gauss-Newton), or EM-ALS. Step 5: Compute FMS, the factor match score of the computed factors against the truth: FMS = (1/R) sum over r of (1 - |lambda_r - lambda_hat_r| / max(lambda_r, lambda_hat_r)) |a_r^T a_hat_r| |b_r^T b_hat_r| |c_r^T c_hat_r|, assuming columns are normalized and lambda_r is the product of the norms. Best FMS is 1.
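Assuming the FMS formula above (congruence of matched, normalized components, penalized by the mismatch in lambda_r), here is a small numpy sketch with a brute-force search over column permutations; the function name fms is our own:

```python
import numpy as np
from itertools import permutations

def fms(true_factors, est_factors):
    """Factor match score between two CP solutions, each a list of
    factor matrices with R columns. lambda_r is the product of the
    column norms across modes; best score is 1. The permutation
    search is factorial in R, so only for small R."""
    R = true_factors[0].shape[1]
    norms = lambda F: np.linalg.norm(F, axis=0)
    lam_t = np.prod([norms(F) for F in true_factors], axis=0)
    lam_e = np.prod([norms(F) for F in est_factors], axis=0)
    U_t = [F / norms(F) for F in true_factors]
    U_e = [F / norms(F) for F in est_factors]
    best = 0.0
    for perm in permutations(range(R)):
        total = 0.0
        for r, s in enumerate(perm):
            penalty = 1 - abs(lam_t[r] - lam_e[s]) / max(lam_t[r], lam_e[s])
            congruence = np.prod([abs(Ut[:, r] @ Ue[:, s])
                                  for Ut, Ue in zip(U_t, U_e)])
            total += penalty * congruence
        best = max(best, total / R)
    return best
```

FMS is invariant to the permutation, sign, and scaling indeterminacies of CP, so a perfect recovery scores 1 even if the columns come back reordered and rescaled.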

35 No Difference in Accuracy Between Methods Missing entries, 50 x 40 x 30, R=5. [Figure: FMS at 80%, 90%, and 95% missing.]

36 No Difference in Accuracy Between Methods Missing entries, 50 x 40 x 30, R=5. [Figure: FMS at 80%, 90%, and 95% missing.] 95% missing is pretty hard at this size; the number of starts matters!

37 Larger Problems are Easier 95% missing entries, R=5. [Figure: FMS for tensors of size 50 x 40 x 30, 100 x 80 x 60, and 150 x 120 x 90; the number of starts matters!]

38 Structured Missing Data is Harder 80% missing, 50 x 40 x 30, R=5. [Figure: FMS for randomly missing entries vs. missing fibers.]

39 Exploiting Sparsity for Speed [Figure: timing comparison at 80%, 90%, and 95% missing.]

40 CP-WOPT scales to larger problems! 500 x 500 x 500 with M=99% (1.25 million nonzeros): dense storage = 1GB, sparse storage = 40MB. 1000 x 1000 x 1000 with M=99.5% (5 million nonzeros): dense storage = 8GB, sparse storage = 160MB. Restarted and converged to the correct solution! [Figure: time and FMS for both problems.]

41 The Very Best Way to Start the Day To: Engineer, Joe S <jsengin@sandia.gov> From: Kolda, Tamara G <tgkolda@sandia.gov> Subject: Re: Help with a math problem Joe, Here you go. There's still a lot of work to do on this topic, but I think you'll find this to be a workable solution in the short term. Thanks for the opportunity to work on such an interesting problem. -Tammy

42 Challenges for Missing Data Problems Ubiquitous real-world applications: data doesn't perfectly fit a (multi-)linear model; missing data patterns may be structured. Theory: many recent discoveries in the matrix case; similar ideas (e.g., coherence, nuclear norm) are needed for tensors! Algorithms: extensions to other tensor factorizations; purposely dropping data for memory/speed of computation, or maybe to avoid local minima; rank determination. For more info: Tammy Kolda, tgkolda@sandia.gov. E. Acar, D. M. Dunlavy, T. G. Kolda and M. Mørup, Scalable Tensor Factorizations for Incomplete Data, arXiv preprint, v1, May.

43 Back-Up Slides

44 Tensor Completion for Internet Traffic Matrices Network traffic data comprises source-destination information over time (a tensor). Missing data arises due to various collection issues. Our data set: one month of Géant data, 23 x 23 x 2756 (15-minute intervals), modeled using two components. [Figure: error in predicting missing elements as the percentage of missing data increases, compared to the baseline model error.]


More information

Data Mining and Matrices

Data Mining and Matrices Data Mining and Matrices 11 Tensor Applications Rainer Gemulla, Pauli Miettinen July 11, 2013 Outline 1 Some Tensor Decompositions 2 Applications of Tensor Decompositions 3 Wrap-Up 2 / 30 Tucker s many

More information

FLEXIBLE HALS ALGORITHMS FOR SPARSE NON-NEGATIVE MATRIX/TENSOR FACTORIZATION. RIKEN Brain Science Institute, LABSP, Wako-shi, Saitama , JAPAN

FLEXIBLE HALS ALGORITHMS FOR SPARSE NON-NEGATIVE MATRIX/TENSOR FACTORIZATION. RIKEN Brain Science Institute, LABSP, Wako-shi, Saitama , JAPAN FLEXIBLE HALS ALGORITHMS FOR SPARSE NON-NEGATIVE MATRIX/TENSOR FACTORIZATION Andre CICHOCKI, Anh Huy PHAN and Cesar CAIAFA RIKEN Brain Science Institute, LABSP, Wako-shi, Saitama 351-198, JAPAN ABSTRACT

More information

Latent Semantic Models. Reference: Introduction to Information Retrieval by C. Manning, P. Raghavan, H. Schutze

Latent Semantic Models. Reference: Introduction to Information Retrieval by C. Manning, P. Raghavan, H. Schutze Latent Semantic Models Reference: Introduction to Information Retrieval by C. Manning, P. Raghavan, H. Schutze 1 Vector Space Model: Pros Automatic selection of index terms Partial matching of queries

More information

CS60021: Scalable Data Mining. Dimensionality Reduction

CS60021: Scalable Data Mining. Dimensionality Reduction J. Leskovec, A. Rajaraman, J. Ullman: Mining of Massive Datasets, http://www.mmds.org 1 CS60021: Scalable Data Mining Dimensionality Reduction Sourangshu Bhattacharya Assumption: Data lies on or near a

More information

Tensor Decompositions for Machine Learning. G. Roeder 1. UBC Machine Learning Reading Group, June University of British Columbia

Tensor Decompositions for Machine Learning. G. Roeder 1. UBC Machine Learning Reading Group, June University of British Columbia Network Feature s Decompositions for Machine Learning 1 1 Department of Computer Science University of British Columbia UBC Machine Learning Group, June 15 2016 1/30 Contact information Network Feature

More information

Using Matrix Decompositions in Formal Concept Analysis

Using Matrix Decompositions in Formal Concept Analysis Using Matrix Decompositions in Formal Concept Analysis Vaclav Snasel 1, Petr Gajdos 1, Hussam M. Dahwa Abdulla 1, Martin Polovincak 1 1 Dept. of Computer Science, Faculty of Electrical Engineering and

More information

Latent Semantic Indexing (LSI) CE-324: Modern Information Retrieval Sharif University of Technology

Latent Semantic Indexing (LSI) CE-324: Modern Information Retrieval Sharif University of Technology Latent Semantic Indexing (LSI) CE-324: Modern Information Retrieval Sharif University of Technology M. Soleymani Fall 2014 Most slides have been adapted from: Profs. Manning, Nayak & Raghavan (CS-276,

More information

Available Ph.D position in Big data processing using sparse tensor representations

Available Ph.D position in Big data processing using sparse tensor representations Available Ph.D position in Big data processing using sparse tensor representations Research area: advanced mathematical methods applied to big data processing. Keywords: compressed sensing, tensor models,

More information

High Performance Parallel Tucker Decomposition of Sparse Tensors

High Performance Parallel Tucker Decomposition of Sparse Tensors High Performance Parallel Tucker Decomposition of Sparse Tensors Oguz Kaya INRIA and LIP, ENS Lyon, France SIAM PP 16, April 14, 2016, Paris, France Joint work with: Bora Uçar, CNRS and LIP, ENS Lyon,

More information

A Randomized Approach for Crowdsourcing in the Presence of Multiple Views

A Randomized Approach for Crowdsourcing in the Presence of Multiple Views A Randomized Approach for Crowdsourcing in the Presence of Multiple Views Presenter: Yao Zhou joint work with: Jingrui He - 1 - Roadmap Motivation Proposed framework: M2VW Experimental results Conclusion

More information

GigaTensor: Scaling Tensor Analysis Up By 100 Times - Algorithms and Discoveries

GigaTensor: Scaling Tensor Analysis Up By 100 Times - Algorithms and Discoveries GigaTensor: Scaling Tensor Analysis Up By 100 Times - Algorithms and Discoveries U Kang, Evangelos Papalexakis, Abhay Harpale, Christos Faloutsos Carnegie Mellon University {ukang, epapalex, aharpale,

More information

SPARSE signal representations have gained popularity in recent

SPARSE signal representations have gained popularity in recent 6958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 57, NO. 10, OCTOBER 2011 Blind Compressed Sensing Sivan Gleichman and Yonina C. Eldar, Senior Member, IEEE Abstract The fundamental principle underlying

More information

Linear Algebra Background

Linear Algebra Background CS76A Text Retrieval and Mining Lecture 5 Recap: Clustering Hierarchical clustering Agglomerative clustering techniques Evaluation Term vs. document space clustering Multi-lingual docs Feature selection

More information

Tensor Decomposition Theory and Algorithms in the Era of Big Data

Tensor Decomposition Theory and Algorithms in the Era of Big Data Tensor Decomposition Theory and Algorithms in the Era of Big Data Nikos Sidiropoulos, UMN EUSIPCO Inaugural Lecture, Sep. 2, 2014, Lisbon Nikos Sidiropoulos Tensor Decomposition in the Era of Big Data

More information

Data Compression Techniques for Urban Traffic Data

Data Compression Techniques for Urban Traffic Data Data Compression Techniques for Urban Traffic Data Muhammad Tayyab Asif, Kanan Srinivasan, Justin Dauwels and Patrick Jaillet School of Electrical and Electronic Engineering, Nanyang Technological University,

More information

Math 671: Tensor Train decomposition methods II

Math 671: Tensor Train decomposition methods II Math 671: Tensor Train decomposition methods II Eduardo Corona 1 1 University of Michigan at Ann Arbor December 13, 2016 Table of Contents 1 What we ve talked about so far: 2 The Tensor Train decomposition

More information

Recovery of Low-Rank Plus Compressed Sparse Matrices with Application to Unveiling Traffic Anomalies

Recovery of Low-Rank Plus Compressed Sparse Matrices with Application to Unveiling Traffic Anomalies July 12, 212 Recovery of Low-Rank Plus Compressed Sparse Matrices with Application to Unveiling Traffic Anomalies Morteza Mardani Dept. of ECE, University of Minnesota, Minneapolis, MN 55455 Acknowledgments:

More information

Tensor Factorization Using Auxiliary Information

Tensor Factorization Using Auxiliary Information Tensor Factorization Using Auxiliary Information Atsuhiro Narita 1,KoheiHayashi 2,RyotaTomioka 1, and Hisashi Kashima 1,3 1 Department of Mathematical Informatics, The University of Tokyo, 7-3-1 Hongo,

More information

Matrix Completion for Structured Observations

Matrix Completion for Structured Observations Matrix Completion for Structured Observations Denali Molitor Department of Mathematics University of California, Los ngeles Los ngeles, C 90095, US Email: dmolitor@math.ucla.edu Deanna Needell Department

More information

Slice Oriented Tensor Decomposition of EEG Data for Feature Extraction in Space, Frequency and Time Domains

Slice Oriented Tensor Decomposition of EEG Data for Feature Extraction in Space, Frequency and Time Domains Slice Oriented Tensor Decomposition of EEG Data for Feature Extraction in Space, and Domains Qibin Zhao, Cesar F. Caiafa, Andrzej Cichocki, and Liqing Zhang 2 Laboratory for Advanced Brain Signal Processing,

More information

To be published in Optics Letters: Blind Multi-spectral Image Decomposition by 3D Nonnegative Tensor Title: Factorization Authors: Ivica Kopriva and A

To be published in Optics Letters: Blind Multi-spectral Image Decomposition by 3D Nonnegative Tensor Title: Factorization Authors: Ivica Kopriva and A o be published in Optics Letters: Blind Multi-spectral Image Decomposition by 3D Nonnegative ensor itle: Factorization Authors: Ivica Kopriva and Andrzej Cichocki Accepted: 21 June 2009 Posted: 25 June

More information

Efficient CP-ALS and Reconstruction From CP

Efficient CP-ALS and Reconstruction From CP Efficient CP-ALS and Reconstruction From CP Jed A. Duersch & Tamara G. Kolda Sandia National Laboratories Livermore, CA Sandia National Laboratories is a multimission laboratory managed and operated by

More information

DATA MINING LECTURE 8. Dimensionality Reduction PCA -- SVD

DATA MINING LECTURE 8. Dimensionality Reduction PCA -- SVD DATA MINING LECTURE 8 Dimensionality Reduction PCA -- SVD The curse of dimensionality Real data usually have thousands, or millions of dimensions E.g., web documents, where the dimensionality is the vocabulary

More information

Machine Learning for Signal Processing Sparse and Overcomplete Representations. Bhiksha Raj (slides from Sourish Chaudhuri) Oct 22, 2013

Machine Learning for Signal Processing Sparse and Overcomplete Representations. Bhiksha Raj (slides from Sourish Chaudhuri) Oct 22, 2013 Machine Learning for Signal Processing Sparse and Overcomplete Representations Bhiksha Raj (slides from Sourish Chaudhuri) Oct 22, 2013 1 Key Topics in this Lecture Basics Component-based representations

More information

Tensor Completion for Estimating Missing Values in Visual Data

Tensor Completion for Estimating Missing Values in Visual Data Tensor Completion for Estimating Missing Values in Visual Data Ji Liu, Przemyslaw Musialski 2, Peter Wonka, and Jieping Ye Arizona State University VRVis Research Center 2 Ji.Liu@asu.edu, musialski@vrvis.at,

More information

Machine Learning. Principal Components Analysis. Le Song. CSE6740/CS7641/ISYE6740, Fall 2012

Machine Learning. Principal Components Analysis. Le Song. CSE6740/CS7641/ISYE6740, Fall 2012 Machine Learning CSE6740/CS7641/ISYE6740, Fall 2012 Principal Components Analysis Le Song Lecture 22, Nov 13, 2012 Based on slides from Eric Xing, CMU Reading: Chap 12.1, CB book 1 2 Factor or Component

More information

Non-Negative Tensor Factorisation for Sound Source Separation

Non-Negative Tensor Factorisation for Sound Source Separation ISSC 2005, Dublin, Sept. -2 Non-Negative Tensor Factorisation for Sound Source Separation Derry FitzGerald, Matt Cranitch φ and Eugene Coyle* φ Dept. of Electronic Engineering, Cor Institute of Technology

More information

c 2008 Society for Industrial and Applied Mathematics

c 2008 Society for Industrial and Applied Mathematics SIAM J MATRIX ANAL APPL Vol 30, No 3, pp 1219 1232 c 2008 Society for Industrial and Applied Mathematics A JACOBI-TYPE METHOD FOR COMPUTING ORTHOGONAL TENSOR DECOMPOSITIONS CARLA D MORAVITZ MARTIN AND

More information

Turbo-SMT: Accelerating Coupled Sparse Matrix-Tensor Factorizations by 200x

Turbo-SMT: Accelerating Coupled Sparse Matrix-Tensor Factorizations by 200x Turbo-SMT: Accelerating Coupled Sparse Matrix-Tensor Factorizations by 2x Evangelos E. Papalexakis epapalex@cs.cmu.edu Christos Faloutsos christos@cs.cmu.edu Tom M. Mitchell tom.mitchell@cmu.edu Partha

More information

Temporal analysis of semantic graphs using ASALSAN

Temporal analysis of semantic graphs using ASALSAN Seventh IEEE International Conference on Data Mining Temporal analysis of semantic graphs using ASALSAN Brett W. Bader Sandia National Laboratories Albuquerque, NM, USA bwbader@sandia.gov Richard A. Harshman

More information

Nonnegative Matrix Factorization. Data Mining

Nonnegative Matrix Factorization. Data Mining The Nonnegative Matrix Factorization in Data Mining Amy Langville langvillea@cofc.edu Mathematics Department College of Charleston Charleston, SC Yahoo! Research 10/18/2005 Outline Part 1: Historical Developments

More information

Latent Semantic Indexing (LSI) CE-324: Modern Information Retrieval Sharif University of Technology

Latent Semantic Indexing (LSI) CE-324: Modern Information Retrieval Sharif University of Technology Latent Semantic Indexing (LSI) CE-324: Modern Information Retrieval Sharif University of Technology M. Soleymani Fall 2016 Most slides have been adapted from: Profs. Manning, Nayak & Raghavan (CS-276,

More information

Lecture Notes 10: Matrix Factorization

Lecture Notes 10: Matrix Factorization Optimization-based data analysis Fall 207 Lecture Notes 0: Matrix Factorization Low-rank models. Rank- model Consider the problem of modeling a quantity y[i, j] that depends on two indices i and j. To

More information

Decompositions of Higher-Order Tensors: Concepts and Computation

Decompositions of Higher-Order Tensors: Concepts and Computation L. De Lathauwer Decompositions of Higher-Order Tensors: Concepts and Computation Lieven De Lathauwer KU Leuven Belgium Lieven.DeLathauwer@kuleuven-kulak.be 1 L. De Lathauwer Canonical Polyadic Decomposition

More information

arxiv: v1 [cs.na] 29 Sep 2017

arxiv: v1 [cs.na] 29 Sep 2017 Fast online low-rank tensor subspace tracking by CP decomposition using recursive least squares from incomplete observations Hiroyuki Kasai arxiv:79.276v [cs.na 29 Sep 27 May 27, 28 Abstract We consider

More information

Matrix Factorization & Latent Semantic Analysis Review. Yize Li, Lanbo Zhang

Matrix Factorization & Latent Semantic Analysis Review. Yize Li, Lanbo Zhang Matrix Factorization & Latent Semantic Analysis Review Yize Li, Lanbo Zhang Overview SVD in Latent Semantic Indexing Non-negative Matrix Factorization Probabilistic Latent Semantic Indexing Vector Space

More information

Factor Matrix Trace Norm Minimization for Low-Rank Tensor Completion

Factor Matrix Trace Norm Minimization for Low-Rank Tensor Completion Factor Matrix Trace Norm Minimization for Low-Rank Tensor Completion Yuanyuan Liu Fanhua Shang Hong Cheng James Cheng Hanghang Tong Abstract Most existing low-n-rank imization algorithms for tensor completion

More information

Introduction to Machine Learning. PCA and Spectral Clustering. Introduction to Machine Learning, Slides: Eran Halperin

Introduction to Machine Learning. PCA and Spectral Clustering. Introduction to Machine Learning, Slides: Eran Halperin 1 Introduction to Machine Learning PCA and Spectral Clustering Introduction to Machine Learning, 2013-14 Slides: Eran Halperin Singular Value Decomposition (SVD) The singular value decomposition (SVD)

More information

A PARALLEL ALGORITHM FOR BIG TENSOR DECOMPOSITION USING RANDOMLY COMPRESSED CUBES (PARACOMP)

A PARALLEL ALGORITHM FOR BIG TENSOR DECOMPOSITION USING RANDOMLY COMPRESSED CUBES (PARACOMP) A PARALLEL ALGORTHM FOR BG TENSOR DECOMPOSTON USNG RANDOMLY COMPRESSED CUBES PARACOMP N.D. Sidiropoulos Dept. of ECE, Univ. of Minnesota Minneapolis, MN 55455, USA E.E. Papalexakis, and C. Faloutsos Dept.

More information

Theoretical Performance Analysis of Tucker Higher Order SVD in Extracting Structure from Multiple Signal-plus-Noise Matrices

Theoretical Performance Analysis of Tucker Higher Order SVD in Extracting Structure from Multiple Signal-plus-Noise Matrices Theoretical Performance Analysis of Tucker Higher Order SVD in Extracting Structure from Multiple Signal-plus-Noise Matrices Himanshu Nayar Dept. of EECS University of Michigan Ann Arbor Michigan 484 email:

More information

CS168: The Modern Algorithmic Toolbox Lecture #18: Linear and Convex Programming, with Applications to Sparse Recovery

CS168: The Modern Algorithmic Toolbox Lecture #18: Linear and Convex Programming, with Applications to Sparse Recovery CS168: The Modern Algorithmic Toolbox Lecture #18: Linear and Convex Programming, with Applications to Sparse Recovery Tim Roughgarden & Gregory Valiant May 25, 2016 1 The Story So Far Recall the setup

More information