Random Sampling of Bandlimited Signals on Graphs


Random Sampling of Bandlimited Signals on Graphs
Pierre Vandergheynst, École Polytechnique Fédérale de Lausanne (EPFL), School of Engineering & School of Computer and Communication Sciences
Joint work with Gilles Puy (INRIA), Nicolas Tremblay (INRIA) and Rémi Gribonval (INRIA)
NIPS 2015 Workshop on Multiresolution Methods for Large Scale Learning

Motivation
Energy networks, social networks, transportation networks, biological networks, point clouds.

Goal
Given partially observed information at the nodes of a graph, can we robustly and efficiently infer the missing information? What signal model should we use? How many observations do we need? What is the influence of the structure of the graph?

Notations
$G = \{V, E, W\}$ is a weighted, undirected graph: $V$ is the set of $n$ nodes, $E$ is the set of edges, and $W \in \mathbb{R}^{n \times n}$ is the weighted adjacency matrix. The diagonal degree matrix $D$ has entries $d_i := \sum_{j \neq i} W_{ij}$. The combinatorial graph Laplacian is $L := D - W$; the normalised Laplacian is $\mathcal{L} := I - D^{-1/2} W D^{-1/2}$.
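As a minimal illustration (not from the talk), both Laplacians can be formed in a few lines of NumPy, assuming a symmetric weight matrix W with no isolated nodes:

```python
import numpy as np

def laplacians(W):
    """Combinatorial and normalised graph Laplacians of a weighted graph.

    W: symmetric (n, n) array of non-negative edge weights, zero diagonal.
    """
    d = W.sum(axis=1)                       # node degrees d_i = sum_j W_ij
    D = np.diag(d)
    L = D - W                               # combinatorial Laplacian
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # assumes no isolated nodes (d_i > 0)
    L_norm = np.eye(len(d)) - D_inv_sqrt @ W @ D_inv_sqrt  # normalised Laplacian
    return L, L_norm
```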

Notations
$L$ is real, symmetric and positive semi-definite: it has orthonormal eigenvectors $U \in \mathbb{R}^{n \times n}$ (the graph Fourier matrix) and non-negative eigenvalues $\lambda_1 \leq \cdots \leq \lambda_n$, so that $L = U \Lambda U^\top$.
$k$-bandlimited signals: $x = U_k \hat{x}_k$, with $x \in \mathbb{R}^n$, Fourier coefficients $\hat{x} = U^\top x$, $\hat{x}_k \in \mathbb{R}^k$, and $U_k := (u_1, \ldots, u_k) \in \mathbb{R}^{n \times k}$ the first $k$ eigenvectors only.
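A small sketch of how a $k$-bandlimited signal can be synthesised from this decomposition; the helper name and the use of numpy.linalg.eigh are illustrative assumptions:

```python
import numpy as np

def bandlimited_signal(L, k, rng=np.random.default_rng(0)):
    """Draw a random k-bandlimited signal x = U_k @ xhat_k."""
    lam, U = np.linalg.eigh(L)   # eigenvalues ascending, orthonormal eigenvectors
    U_k = U[:, :k]               # first k Fourier modes
    xhat_k = rng.standard_normal(k)
    return U_k @ xhat_k, U_k
```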

Sampling Model
Choose a sampling distribution $p \in \mathbb{R}^n$ with $p_i > 0$ and $\|p\|_1 = \sum_{i=1}^{n} p_i = 1$, and set $P := \mathrm{diag}(p) \in \mathbb{R}^{n \times n}$.
Draw $m$ samples independently (random sampling): $\mathbb{P}(\omega_j = i) = p_i$, $\forall j \in \{1, \ldots, m\}$ and $\forall i \in \{1, \ldots, n\}$.
Observe $y_j := x_{\omega_j}$, $\forall j \in \{1, \ldots, m\}$, i.e. $y = Mx$.
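The sampling step itself is one line with numpy.random (a sketch; the function name is ours):

```python
import numpy as np

def sample_signal(x, p, m, rng=np.random.default_rng(0)):
    """Observe m node values of x, drawn i.i.d. with probabilities p (y = M x)."""
    omega = rng.choice(len(x), size=m, replace=True, p=p)  # sampled node indices
    y = x[omega]                                           # y_j = x_{omega_j}
    return y, omega
```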

Sampling Model
Local coherence: $\frac{\|U_k^\top \delta_i\|_2}{\|\delta_i\|_2} = \|U_k^\top \delta_i\|_2$ measures how much a perfect impulse at node $i$ can be concentrated on the first $k$ eigenvectors. It carries interesting information about the graph.
Ideally: $p_i$ should be large wherever $\|U_k^\top \delta_i\|_2$ is large.
Graph coherence: $\nu_p^k := \max_{1 \leq i \leq n} \left\{ p_i^{-1/2} \, \|U_k^\top \delta_i\|_2 \right\}$. Remark: $\nu_p^k \geq \sqrt{k}$.
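Both quantities are cheap to evaluate once $U_k$ is known; a sketch (the function name is ours):

```python
import numpy as np

def graph_coherence(U_k, p):
    """Local coherences ||U_k^T delta_i||_2 and graph coherence nu_p^k."""
    local = np.linalg.norm(U_k, axis=1)  # row norms = ||U_k^T delta_i||_2
    nu = np.max(local / np.sqrt(p))      # nu_p^k = max_i p_i^{-1/2} ||U_k^T delta_i||_2
    return local, nu
```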

Stable Embedding
Theorem 1 (restricted isometry property). Let $M$ be a random subsampling matrix with the sampling distribution $p$. For any $\delta, \epsilon \in (0, 1)$, with probability at least $1 - \epsilon$,
$$(1 - \delta)\, \|x\|_2^2 \;\leq\; \frac{1}{m}\, \|M P^{-1/2} x\|_2^2 \;\leq\; (1 + \delta)\, \|x\|_2^2$$
for all $x \in \mathrm{span}(U_k)$, provided that
$$m \;\geq\; \frac{3}{\delta^2}\, (\nu_p^k)^2 \, \log\!\left(\frac{2k}{\epsilon}\right).$$
Remarks:
$M P^{-1/2} x = P_\Omega^{-1/2} M x$, where $P_\Omega := \mathrm{diag}(p_{\omega_1}, \ldots, p_{\omega_m})$: only $M$ is needed at sampling time, the re-weighting can be done offline.
$(\nu_p^k)^2 \geq k$: we need to sample at least $k$ nodes.
The proof is similar to compressed sensing in bounded orthonormal bases, but simpler since the model is a subspace (not a union of subspaces).
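A quick Monte Carlo sanity check of the embedding (illustrative, assuming U_k and p as above): draw one sample set $\Omega$ and verify that the normalised energies of random bandlimited signals stay within $[1-\delta, 1+\delta]$:

```python
import numpy as np

def rip_ratio(U_k, p, m, trials=1000, rng=np.random.default_rng(0)):
    """Empirically check (1/m)||P_Omega^{-1/2} M x||^2 / ||x||^2 over random
    k-bandlimited x, for one fixed random sample set Omega."""
    n, k = U_k.shape
    omega = rng.choice(n, size=m, replace=True, p=p)
    w = 1.0 / np.sqrt(p[omega])               # offline re-weighting P_Omega^{-1/2}
    ratios = []
    for _ in range(trials):
        x = U_k @ rng.standard_normal(k)      # random bandlimited signal
        ratios.append(np.sum((w * x[omega]) ** 2) / (m * np.sum(x ** 2)))
    return min(ratios), max(ratios)           # should lie in [1-delta, 1+delta]
```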

Stable Embedding
Since $(\nu_p^k)^2 \geq k$, at least $k$ samples are needed. Can we reduce $m$ to this optimal amount?
Variable Density Sampling: the distribution
$$p_i := \frac{\|U_k^\top \delta_i\|_2^2}{k}, \qquad i = 1, \ldots, n,$$
is such that $(\nu_p^k)^2 = k$, and it depends on the structure of the graph.
Corollary 1. Let $M$ be a random subsampling matrix constructed with this sampling distribution $p$. For any $\delta, \epsilon \in (0, 1)$, with probability at least $1 - \epsilon$,
$$(1 - \delta)\, \|x\|_2^2 \;\leq\; \frac{1}{m}\, \|M P^{-1/2} x\|_2^2 \;\leq\; (1 + \delta)\, \|x\|_2^2$$
for all $x \in \mathrm{span}(U_k)$, provided that $m \geq \frac{3}{\delta^2}\, k \, \log\!\left(\frac{2k}{\epsilon}\right)$.
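Computing this distribution is immediate once $U_k$ is available (a sketch):

```python
import numpy as np

def optimal_sampling_distribution(U_k):
    """Variable density sampling distribution p_i = ||U_k^T delta_i||^2 / k."""
    k = U_k.shape[1]
    p = np.linalg.norm(U_k, axis=1) ** 2 / k  # sums to 1 since ||U_k||_F^2 = k
    return p
```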

Recovery Procedures
Observe $y = Mx + n$, with $y \in \mathbb{R}^m$, $x \in \mathrm{span}(U_k)$, and the stable embedding in force.
Standard decoder:
$$\min_{z \in \mathrm{span}(U_k)} \; \left\| P_\Omega^{-1/2} (Mz - y) \right\|_2.$$
It needs the projector onto $\mathrm{span}(U_k)$; the re-weighting is there for the RIP.
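With $z = U_k a$, the standard decoder reduces to a weighted least-squares problem in the $k$ coefficients; a sketch assuming the sampled indices omega and distribution p from before:

```python
import numpy as np

def standard_decoder(y, omega, p, U_k):
    """Solve min_{z in span(U_k)} ||P_Omega^{-1/2}(Mz - y)||_2 by least squares,
    writing z = U_k @ a."""
    w = 1.0 / np.sqrt(p[omega])      # re-weighting for the RIP
    A = w[:, None] * U_k[omega, :]   # P_Omega^{-1/2} M U_k
    a, *_ = np.linalg.lstsq(A, w * y, rcond=None)
    return U_k @ a
```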

Recovery Procedures
Efficient decoder:
$$\min_{z \in \mathbb{R}^n} \; \left\| P_\Omega^{-1/2} (Mz - y) \right\|_2^2 + \gamma \, z^\top g(L)\, z,$$
a soft constraint on frequencies, with an efficient implementation.
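One possible implementation (a sketch, not the authors' code): solve the normal equations $(M^\top P_\Omega^{-1} M + \gamma\, g(L))\, z = M^\top P_\Omega^{-1} y$, here densely with $g(L) = L$; at scale one would instead run conjugate gradients, applying the polynomial $g(L)$ by sparse matrix-vector products:

```python
import numpy as np

def efficient_decoder(y, omega, p, L, gamma, g=lambda L: L):
    """Solve min_z ||P_Omega^{-1/2}(Mz - y)||^2 + gamma * z^T g(L) z via the
    normal equations (dense, for illustration only)."""
    n = L.shape[0]
    MtPinvM = np.zeros((n, n))
    MtPinvy = np.zeros(n)
    for j, i in enumerate(omega):   # accumulate M^T P_Omega^{-1} M and M^T P_Omega^{-1} y
        MtPinvM[i, i] += 1.0 / p[i]
        MtPinvy[i] += y[j] / p[i]
    return np.linalg.solve(MtPinvM + gamma * g(L), MtPinvy)
```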

Analysis of Standard Decoder
Standard decoder: $\min_{z \in \mathrm{span}(U_k)} \|P_\Omega^{-1/2} (Mz - y)\|_2$.
Theorem 2. Let $\Omega$ be a set of $m$ indices selected independently from $\{1, \ldots, n\}$ with sampling distribution $p \in \mathbb{R}^n$, and $M$ the associated sampling matrix. Let $\delta, \epsilon \in (0, 1)$ and $m \geq \frac{3}{\delta^2} (\nu_p^k)^2 \log\frac{2k}{\epsilon}$. With probability at least $1 - \epsilon$, the following holds for all $x \in \mathrm{span}(U_k)$ and all $n \in \mathbb{R}^m$.
i) Let $x^*$ be the solution of the standard decoder with $y = Mx + n$. Then
$$\|x^* - x\|_2 \;\leq\; \frac{2}{\sqrt{m\,(1 - \delta)}} \, \left\| P_\Omega^{-1/2} n \right\|_2.$$
In particular, recovery is exact in the noiseless case.
ii) There exist particular vectors $n_0 \in \mathbb{R}^m$ such that the solution $x^*$ of the standard decoder with $y = Mx + n_0$ satisfies
$$\|x^* - x\|_2 \;\geq\; \frac{1}{\sqrt{m\,(1 + \delta)}} \, \left\| P_\Omega^{-1/2} n_0 \right\|_2.$$

Analysis of Efficient Decoder
Efficient decoder: $\min_{z \in \mathbb{R}^n} \|P_\Omega^{-1/2}(Mz - y)\|_2^2 + \gamma\, z^\top g(L)\, z$, with $g$ non-negative.
A filter reshapes Fourier coefficients: for $h : \mathbb{R} \to \mathbb{R}$,
$$x_h := U \,\mathrm{diag}(\hat{h})\, U^\top x \in \mathbb{R}^n, \qquad \hat{h} = (h(\lambda_1), \ldots, h(\lambda_n)) \in \mathbb{R}^n.$$
For a polynomial filter $p(t) = \sum_{i=0}^{d} \alpha_i t^i$,
$$x_p = U \,\mathrm{diag}(\hat{p})\, U^\top x = \sum_{i=0}^{d} \alpha_i L^i x.$$
Pick special polynomials and use e.g. recurrence relations for fast filtering (with sparse matrix-vector multiplies only).
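This is why only sparse matrix-vector products are needed; a direct monomial-basis sketch (Chebyshev recurrences, mentioned on the slide, are the better-conditioned alternative in practice):

```python
import numpy as np

def polynomial_filter(L, x, alpha):
    """Apply the polynomial filter p(L) x = sum_i alpha[i] * L^i x using only
    matrix-vector products (L may be a scipy sparse matrix)."""
    out = alpha[0] * x
    v = x
    for a in alpha[1:]:
        v = L @ v            # next power of L applied to x
        out = out + a * v
    return out
```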

Analysis of Efficient Decoder
Efficient decoder: $\min_{z \in \mathbb{R}^n} \|P_\Omega^{-1/2}(Mz - y)\|_2^2 + \gamma\, z^\top g(L)\, z$, with $g$ non-negative and non-decreasing, i.e. it penalises high frequencies.
This favours the reconstruction of approximately band-limited signals. The ideal filter
$$i_k(t) := \begin{cases} 0 & \text{if } t \in [0, \lambda_k], \\ +\infty & \text{otherwise}, \end{cases}$$
yields the standard decoder.

Analysis of Efficient Decoder
Theorem 3. Let $\Omega$, $M$, $P$, $m$ be as before and $M_{\max} > 0$ be a constant such that $\|M P^{-1/2}\|_2 \leq M_{\max}$. Let $\delta, \epsilon \in (0, 1)$. With probability at least $1 - \epsilon$, the following holds for all $x \in \mathrm{span}(U_k)$, all $n \in \mathbb{R}^m$, all $\gamma > 0$, and all non-negative, non-decreasing polynomial functions $g$ such that $g(\lambda_{k+1}) > 0$. Let $x^*$ be the solution of the efficient decoder with $y = Mx + n$. Then
$$\|\alpha^* - x\|_2 \;\leq\; \frac{1}{\sqrt{m(1-\delta)}} \left[ \left( 2 + \frac{M_{\max}}{\sqrt{\gamma\, g(\lambda_{k+1})}} \right) \left\| P_\Omega^{-1/2} n \right\|_2 + \left( M_{\max} \sqrt{\frac{g(\lambda_k)}{g(\lambda_{k+1})}} + \sqrt{\gamma\, g(\lambda_k)} \right) \|x\|_2 \right],$$
and
$$\|\beta^*\|_2 \;\leq\; \frac{1}{\sqrt{\gamma\, g(\lambda_{k+1})}} \left\| P_\Omega^{-1/2} n \right\|_2 + \sqrt{\frac{g(\lambda_k)}{g(\lambda_{k+1})}}\, \|x\|_2,$$
where $\alpha^* := U_k U_k^\top x^*$ and $\beta^* := (I - U_k U_k^\top)\, x^*$.

Analysis of Efficient Decoder
Noiseless case:
$$\|x^* - x\|_2 \;\leq\; \frac{1}{\sqrt{m(1-\delta)}} \left( M_{\max} \sqrt{\frac{g(\lambda_k)}{g(\lambda_{k+1})}} + \sqrt{\gamma\, g(\lambda_k)} \right) \|x\|_2 + \sqrt{\frac{g(\lambda_k)}{g(\lambda_{k+1})}}\, \|x\|_2.$$
$g(\lambda_k) = 0$ together with $g$ non-decreasing implies perfect reconstruction.
Otherwise: choose $\gamma$ as close as possible to $0$ and seek to minimise the ratio $g(\lambda_k)/g(\lambda_{k+1})$.
With noise, the error scales with $\|P_\Omega^{-1/2} n\|_2 / \|x\|_2$. Choose the filter to increase the spectral gap? Clusters are of course good.

Estimating the Optimal Distribution
We need to estimate $\|U_k^\top \delta_i\|_2$.
Filter random signals $r$ (with $\mathbb{E}(r r^\top) = I$) with the ideal low-pass filter $b_k$:
$$r_{b_k} = U \,\mathrm{diag}(1, \ldots, 1, 0, \ldots, 0)\, U^\top r = U_k U_k^\top r,$$
$$\mathbb{E}\left[(r_{b_k})_i^2\right] = \delta_i^\top\, U_k U_k^\top\, \mathbb{E}(r r^\top)\, U_k U_k^\top\, \delta_i = \|U_k^\top \delta_i\|_2^2.$$
In practice, one may use a polynomial approximation $c_k$ of the ideal filter and estimate
$$\tilde{p}_i := \frac{\sum_{l=1}^{L} (r^l_{c_k})_i^2}{\sum_{i=1}^{n} \sum_{l=1}^{L} (r^l_{c_k})_i^2}, \qquad L \geq C \log n.$$
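A Monte Carlo sketch of this estimator; filter_lowpass stands for the polynomial approximation $c_k$ of the ideal filter and is an assumed callable:

```python
import numpy as np

def estimate_distribution(filter_lowpass, n, n_signals):
    """Estimate p_i ~ ||U_k^T delta_i||^2 by low-pass filtering random signals.

    filter_lowpass: callable r -> (approximate) U_k U_k^T r, applied by
    sparse mat-vecs via a polynomial approximation of the ideal filter.
    """
    rng = np.random.default_rng(0)
    acc = np.zeros(n)
    for _ in range(n_signals):         # n_signals on the order of log n suffices
        r = rng.standard_normal(n)     # E[r r^T] = I
        acc += filter_lowpass(r) ** 2  # accumulate (r_{c_k})_i^2
    return acc / acc.sum()             # normalise to a probability distribution
```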

Estimating the Eigengap
Again, low-pass filter a batch of $L$ random signals, now with a filter $b_\lambda$ of variable cut-off $\lambda$; if $j$ eigenvalues lie below $\lambda$, concentration gives
$$(1 - \epsilon) \sum_{i=1}^{n} \|U_j^\top \delta_i\|_2^2 \;\leq\; \frac{1}{L} \sum_{l=1}^{L} \sum_{i=1}^{n} (r^l_{b_\lambda})_i^2 \;\leq\; (1 + \epsilon) \sum_{i=1}^{n} \|U_j^\top \delta_i\|_2^2.$$
Since $\sum_{i=1}^{n} \|U_j^\top \delta_i\|_2^2 = \|U_j\|_{\mathrm{Frob}}^2 = j$, we have
$$(1 - \epsilon)\, j \;\leq\; \frac{1}{L} \sum_{l=1}^{L} \sum_{i=1}^{n} (r^l_{b_\lambda})_i^2 \;\leq\; (1 + \epsilon)\, j,$$
i.e. the filtered energy counts the eigenvalues below the cut-off. Locate $\lambda_k$ by dichotomy on the filter bandwidth.
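The dichotomy itself is a plain bisection; count_below (an assumed callable) returns the estimated eigenvalue count below a candidate cut-off, e.g. the average filtered energy above:

```python
def estimate_lambda_k(count_below, k, lam_max, tol=1e-6):
    """Bisection on the filter bandwidth to locate lambda_k.

    count_below: callable lam -> estimated number of eigenvalues <= lam.
    lam_max: upper bound on the spectrum (e.g. 2 for the normalised Laplacian).
    """
    lo, hi = 0.0, lam_max
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if count_below(mid) >= k:   # at least k eigenvalues below mid
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```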

Experiments
[Figures: estimated sampling distributions and recovery results, including a graph with unbalanced clusters; one panel is labelled 7%.]

Compressive Spectral Clustering
Clustering is equivalent to the recovery of cluster assignment functions; well-defined clusters give band-limited assignment functions!
Generate features by filtering random signals: by Johnson-Lindenstrauss, $d \geq \frac{4 + 2\beta}{\epsilon^2/2 - \epsilon^3/3} \log n$ filtered random signals suffice.
Each feature map is smooth, so keep only $m$ sampled nodes, with $m$ on the order of $k \log k$. Use $k$-means on the compressed data and feed the result into the efficient decoder. A sketch of the pipeline follows.
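A compact sketch of the pipeline under the assumptions above (filter_lowpass is again an assumed callable approximating $r \mapsto U_k U_k^\top r$; decoding the full cluster assignment with the efficient decoder is omitted):

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def compressive_spectral_clustering(filter_lowpass, n, k, d, m, p,
                                    rng=np.random.default_rng(0)):
    """Sketch: cluster only m sampled nodes from d filtered random features."""
    R = rng.standard_normal((n, d)) / np.sqrt(d)             # d random signals
    F = np.column_stack([filter_lowpass(R[:, j]) for j in range(d)])
    F /= np.linalg.norm(F, axis=1, keepdims=True) + 1e-12    # normalised features
    omega = rng.choice(n, size=m, replace=True, p=p)         # keep m ~ k log k nodes
    _, labels = kmeans2(F[omega], k, minit='++')             # k-means on compressed data
    return omega, labels   # cluster indicators, to be interpolated by the decoder
```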

Compressive Spectral Clustering
[Figure: comparison plot; the recoverable labels involve log k and k log k terms.]

Conclusion
Stable, robust and universal random sampling of smoothly varying information on graphs.
Tractable decoder with guarantees.
Optimal sampling distribution depends on the graph structure.
Can be used for inference and (SVD-less) compressive clustering.

Thank you!
