Detecting Sparse Structures in Data in Sub-Linear Time: A group testing approach

Boaz Nadler, The Weizmann Institute of Science, Israel
Joint works with Inbal Horev, Ronen Basri, Meirav Galun and Ery Arias-Castro; Yi-Qing Wang, Alain Trouve, Yali Amit; Roi Weiss, Chen Attias, Robert Krauthgamer
Dec 2017

Statistical Challenges related to big data
In various applications (vision in particular), we collect so much data that either
1) the data does not fit / cannot be processed on a single machine, or
2) standard algorithms that pass over all the data are too slow / take too much computing power.
A common approach to handle setting (1) is distributed learning; not the focus of this talk, see [Rosenblatt & N. 16], On the optimality of averaging in distributed statistical learning.

Statistical challenges related to big data
Focus of this talk: (2) standard algorithms to solve a task are too slow.
Two key challenges:
- [computational & practical] develop extremely fast algorithms (linear / sub-linear complexity)
- [theoretical] understand lower bounds on statistical accuracy under computational constraints
In this talk: study these two challenges for
(i) edge detection in large noisy images
(ii) finding sparse representations in high-dimensional dictionaries

Edge Detection
Observe an $n_1 \times n_2$ image $I$ = array of pixel values.
Goal: detect edges in the image, typically boundaries between objects. Search for curves $\Gamma$ such that in the direction $\vec{n}$ normal to the curve $\Gamma$, the directional derivative $\nabla I \cdot \vec{n}$ is large.
A fundamental task in low-level image processing: a well-studied problem, many algorithms, well-understood theory.

Edge Detection at low SNR
Our interest: edge detection in noisy and large 2D images and 3D video.
Motivation for large: high-resolution images in many applications.
Motivation(s) for noisy images:
1. Images at non-ideal conditions: poor lighting, fog, rain, night
2. Surveillance applications
3. Real-time object tracking in 3D video
Image prior:
- Interested in long straight (or weakly curved) edges
- Sparsity: the image contains few edges

Example: Powerlines

Traditional Edge Detection Algorithms
Typical approach: detect edges from local image gradients.
Example: the Canny edge detector, complexity $O(n^2)$, linear in the total number of image pixels: fast, possibly suitable for real time.
Limitation: does not work well at low SNR.
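For reference, a gradient-based detector of this kind is a few lines with standard tooling. A minimal sketch using OpenCV's Canny; the file name and thresholds are illustrative, not from the talk:

```python
import cv2

# Load a grayscale image (file name is hypothetical).
img = cv2.imread("powerlines.png", cv2.IMREAD_GRAYSCALE)

# Canny: gradient computation + hysteresis thresholding, O(n^2) work
# for an n x n image. The two thresholds are illustrative and data-dependent.
edges = cv2.Canny(img, 50, 150)
```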

Example: Canny, run time 2.5 sec. Cannot detect the faint powerlines of the second tower.

Modern Sophisticated Methods
- Statistical theory for limits of detectability [Arias-Castro, Donoho, Huo 05], [Brandt, Galun, Basri 07], [Alpert, Galun, Nadler, Basri 10], [Ofir, Galun, Nadler, Basri 15]
- (Theoretically) efficient multiscale algorithms, robust to noise, and yet slow
Run time: minutes for large images, hours for video.

Example: Straight Segment Detector, run time 5 min

Challenge: Sublinear Time Edge Detection
Goal: devise an edge detection algorithm that is (i) robust to noise and (ii) extremely fast.
Given a noisy $n \times n$ image $I$, detect long straight edges in sublinear time: complexity $O(n^\alpha)$ with $\alpha < 2$, touching only a fraction of the image/video pixels!
Questions:
a) Statistical: which edge strengths can one detect vs. $\alpha$?
b) Computational: optimal sampling scheme?
c) Practical: a sub-linear time algorithm?

Problem Setup
Observe a noisy $n \times n$ image $I = I_0 + \xi$, where $I_0$ is the noise-free image and $\xi$ is i.i.d. additive noise with zero mean and variance $\sigma^2$.
Goal: detect the edges of $I_0$ from the noisy $I$.
Assumptions:
- The clean image $I_0$ contains few step edges (sparsity)
- Edges of interest are straight and sufficiently long
Definition: edge signal-to-noise ratio = edge contrast / $\sigma$.
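A minimal sketch of this observation model in NumPy, with a single vertical step edge standing in for $I_0$; the size, noise level, and contrast values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma, contrast = 512, 1.0, 0.5   # illustrative values

# Clean image I0: a single vertical step edge of the given contrast.
I0 = np.zeros((n, n))
I0[:, n // 2:] = contrast

# Observation: I = I0 + xi, with xi i.i.d. zero-mean noise of variance sigma^2.
I = I0 + sigma * rng.standard_normal((n, n))

# Edge SNR = edge contrast / sigma (here 0.5).
```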

Theoretical Questions
Given a sub-linear budget:
1) What are optimal sampling schemes?
2) What are the fundamental limitations on sub-linear edge detection?
3) What is the tradeoff between statistical accuracy and computational complexity?

Optimal Sublinear Edge Detection
For the theoretical analysis, consider the following class of images:
$\mathcal{I}$ = { $I$ contains only noise, or one long fiber plus noise }

Fundamental Limitations / Design Principles
Focus on detection under a worst-case scenario.
Lemma: If the number of observed pixels is $n^\alpha$ with $\alpha < 1$, then there exists $I \in \mathcal{I}$ whose edges cannot be detected.
Theorem: Assume the number of observed pixels is $s$ and $s/n$ is an integer. Then
i) any optimal sampling scheme must observe exactly $s/n$ pixels per row;
ii) sampling $s/n$ whole columns is an optimal scheme.
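A sketch of the column-sampling scheme from part (ii) of the theorem, under the theorem's assumption that $s/n$ is an integer (at least 1); the function name is mine:

```python
import numpy as np

def sample_columns(I, s):
    """Observe s pixels of an n x n image by reading s/n whole columns,
    evenly spaced across the image (assumes s/n is an integer >= 1)."""
    n = I.shape[0]
    cols = np.linspace(0, n - 1, s // n, dtype=int)
    return I[:, cols], cols
```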

Statistical Accuracy vs. Computational Complexity
Definition: edge SNR = edge contrast / noise level.
Theorem: At complexity $O(n^\alpha)$ with $\alpha \ge 1$, the detectable edge SNR scales as $\ln n / n^\alpha$.
[Figure: detection threshold as a function of $\alpha$; the curve $\ln n / n^\alpha$ separates the detectable region from the region marked "not possible".]

Sublinear Edge Detection Algorithm
Key idea:
- Sample a few image strips
- First detect edges in the strips
- Next: non-maximal suppression, edge localization
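A toy version of the strip test, restricted to vertical edges for brevity (the actual algorithm scans many orientations and follows with non-maximal suppression and localization); the names and threshold are mine:

```python
import numpy as np

def edges_in_strip(strip, thresh):
    # Average the strip's rows: for a straight vertical edge this boosts
    # the SNR by roughly sqrt(strip height) while touching few pixels.
    profile = strip.mean(axis=0)
    # Flag columns where the averaged profile jumps sharply.
    return np.flatnonzero(np.abs(np.diff(profile)) > thresh)

def sublinear_edge_scan(I, strip_height, n_strips, thresh):
    # Only the pixels inside the sampled strips are ever read.
    n = I.shape[0]
    tops = np.linspace(0, n - strip_height, n_strips, dtype=int)
    return {t: edges_in_strip(I[t:t + strip_height], thresh) for t in tops}
```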

Example: noisy image, SNR = 1
[Panels: the noisy image, the Canny output, and the sub-linear detector output.]

Sublinear Edge Detection, run time: a few seconds

Some Results on Real Images
Detection with roughly 10% of the pixels sampled.

Real Images

Part II: Sparse Representations in High Dimensions
Problem setup [with Chen Attias and Roi Weiss]:
- A dictionary $\Phi = [\varphi_1, \varphi_2, \ldots, \varphi_N] \in \mathbb{R}^{p \times N}$
- Each atom normalized: $\|\varphi_i\| = 1$
- High-dimensional: $p \gg 1$
- Possibly redundant: $N = Jp$ with $J \ge 1$
Definition: a signal $s \in \mathbb{R}^p$ is $m$-sparse over the dictionary $\Phi$ if $s = \Phi \alpha$ where $|\mathrm{supp}(\alpha)| = m \ll p$.
Goal: given (a noisy version of) $s$, find $\alpha$.
Applications: image and signal analysis.

Computing a Sparse Representation
[Davis et al. 97] With no assumptions on $\Phi$ and on $m$, the problem is NP-hard.
Key challenge: find the support. Once $\mathrm{supp}(\alpha)$ is known, recovering $\alpha$ requires $O(pm^2)$ operations.
Definition: the coherence $\mu$ of a dictionary $\Phi$ is $\mu = \max_{i \ne j} |\langle \varphi_i, \varphi_j \rangle|$.
An $m$-sparse signal satisfies the mutual incoherence property (MIP) if $(2m - 1)\mu < 1$.
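Both quantities are straightforward to compute directly from the definitions; a sketch, assuming unit-norm columns:

```python
import numpy as np

def coherence(Phi):
    # mu = max_{i != j} |<phi_i, phi_j>| over unit-norm atoms.
    G = np.abs(Phi.T @ Phi)
    np.fill_diagonal(G, 0.0)
    return G.max()

def satisfies_mip(Phi, m):
    # MIP sufficient condition for exact m-sparse recovery: (2m - 1) * mu < 1.
    return (2 * m - 1) * coherence(Phi) < 1
```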

Orthogonal Matching Pursuit
[Donoho & Elad 03, Tropp 04, others...]
Theorem: Suppose the $m$-sparse signal $s = \Phi \alpha$ satisfies the MIP condition. Then the solution of the basis pursuit (BP) problem
$\arg\min_{x \in \mathbb{R}^N} \|x\|_1 \quad \text{s.t.} \quad s = \Phi x$
recovers the representation $\alpha$ exactly. Orthogonal matching pursuit (OMP) also recovers $\alpha$ exactly.
Time complexity of OMP is $O(mp^2)$, and of BP even higher.
Can one compute a sparse representation faster?
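For concreteness, a textbook OMP sketch (sparsity $m$ assumed known); the $\Phi^\top r$ correlation inside the loop is exactly the cost discussed on the next slide:

```python
import numpy as np

def omp(Phi, s, m):
    """Orthogonal Matching Pursuit: greedily select atoms, then re-fit."""
    p, N = Phi.shape
    residual, support = s.copy(), []
    for _ in range(m):
        c = Phi.T @ residual                       # all N correlations: O(Np)
        support.append(int(np.argmax(np.abs(c))))  # best-matching new atom
        # Orthogonal step: least-squares fit of s on the selected atoms.
        coef, *_ = np.linalg.lstsq(Phi[:, support], s, rcond=None)
        residual = s - Phi[:, support] @ coef
    alpha = np.zeros(N)
    alpha[support] = coef
    return alpha
```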

Runtime of Orthogonal Matching Pursuit
Key quantity for identifying the significant atoms at iteration $k$: $c^k = \Phi^\top r^{k-1} \in \mathbb{R}^N$, the vector of all inner products between the residual $r^{k-1}$ and the $N$ atoms of $\Phi$.
Computing all $N$ inner products:
- For structured $\Phi$ (Fourier, wavelet, sparse-graph codes, ...): runtime $O(N \log N)$
- For non-structured $\Phi$: runtime $O(Np)$
If $N = Jp$, the runtime is quadratic in the signal dimension: $O(p^2)$.
Question: can one identify the largest entries of $c^k$ in nearly-linear time?
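The gap between the two cases in one snippet: for a Fourier dictionary the entire correlation vector is a single FFT, while an unstructured dictionary needs a dense matrix-vector product. Sizes are illustrative:

```python
import numpy as np

p = 1024
r = np.random.randn(p)

# Unstructured dictionary: all N correlations cost O(Np).
Phi = np.random.randn(p, 2 * p) / np.sqrt(p)
c_dense = Phi.T @ r

# Structured (Fourier) dictionary: the same N = p correlations via one FFT,
# O(N log N); entries are the inner products with the Fourier atoms,
# up to normalization and conjugation.
c_fourier = np.fft.fft(r)
```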

Our contribution: A group testing approach
Use a (random) tree-based statistical group testing data structure:
- Finds at least one atom from $r^{k-1}$'s representation w.h.p.
- If the coherence is $\mu = O(1/\sqrt{p})$: $O(pm^2 \log(pm))$ time
- Memory: $O(Np \log(pm))$
OMP + statistical group testing (GT-OMP):
- At most $m$ iterations for exact recovery
- Total runtime $O(pm^3 \log(pm))$
- For sparsity $m = o(p^{1/3})$: runtime sub-quadratic in $p$ (recall MIP requires $m = O(p^{1/2})$)
- For sparsity $m = O(\log p)$: runtime near-linear in $p$
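A heavily simplified, deterministic sketch of the tree idea (the actual data structure is randomized and repeats tests to guarantee success w.h.p.): precompute, for each node of a binary tree over atom indices, the sum of the atoms below it, then descend from the root, at each node keeping the child whose group sum correlates more strongly with the residual. Assumes $N$ is a power of two; all names are mine:

```python
import numpy as np

def build_group_sums(Phi):
    # levels[k][:, j] = sum of atoms j*2^k ... (j+1)*2^k - 1 (N a power of 2).
    levels = [Phi]
    while levels[-1].shape[1] > 1:
        prev = levels[-1]
        levels.append(prev[:, 0::2] + prev[:, 1::2])
    return levels  # O(Np) extra memory in total

def descend(levels, r):
    # Each test is one length-p inner product, so a full descent costs
    # O(p log N) instead of the O(Np) of correlating r with every atom.
    j = 0
    for k in range(len(levels) - 2, -1, -1):
        left, right = 2 * j, 2 * j + 1
        j = left if abs(levels[k][:, left] @ r) >= abs(levels[k][:, right] @ r) else right
    return j  # candidate atom index
```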

Simulation results
Random Gaussian dictionary: $N = 2p$, $\Phi_{ij} \sim N(0, 1/p)$; sparsity $m = 4 \log_2 p$.
Compared 4 algorithms:
- OMP [Pati et al. 93, Mallat & Zhang 92]
- OMP-threshold [Yang & de Hoog 15]
- Stagewise OMP [Donoho et al. 12]
- GT-OMP (ours)
Averaged over 25 random signals. All 4 algorithms: 100% success in recovering all 25 signals.
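The setup above is easy to reproduce; a sketch of the dictionary and signal generation, with the seed and dimension illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 4096
N = 2 * p
m = int(4 * np.log2(p))                        # sparsity from the slide

# Random Gaussian dictionary, Phi_ij ~ N(0, 1/p): columns have unit norm
# in expectation.
Phi = rng.standard_normal((p, N)) / np.sqrt(p)

# A random m-sparse signal s = Phi @ alpha.
alpha = np.zeros(N)
support = rng.choice(N, size=m, replace=False)
alpha[support] = rng.standard_normal(m)
s = Phi @ alpha
```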

Runtime vs. dimension
[Figure: number of inner products vs. signal dimension $p$ for OMP, OMP-T, StOMP, and GT-OMP.]

Summary
Increasing need for fast (possibly sub-linear) algorithms in various big-data applications. We considered edge detection and sparse representation.
Common theme to both problems:
- Formulate estimation as a search in a very large space of possible hypotheses
- Construct coarse tests that rule out many hypotheses at once
- Perform more expensive/accurate tests on the remaining hypotheses
Similar ideas: Gilbert et al., Willett et al. 14, Meinshausen et al. 09, Haupt et al., ...

Summary: Open Questions
- Can a similar approach solve other inference/learning problems?
- Can one precisely quantify statistical vs. computational tradeoffs for other problems?

The End
Still a long way to go. Thank you!
