Book of Abstracts - Extract. 86th Annual Meeting. March 23-27, 2015, Lecce, Italy. jahrestagung.gamm-ev.de


GESELLSCHAFT für ANGEWANDTE MATHEMATIK und MECHANIK e.V. / INTERNATIONAL ASSOCIATION of APPLIED MATHEMATICS and MECHANICS. 86th Annual Meeting of the International Association of Applied Mathematics and Mechanics, March 23-27, 2015, Lecce, Italy. Book of Abstracts - Extract 2015. jahrestagung.gamm-ev.de

Scientific Program - Timetable (overview, Sunday, March 22 to Friday, March 27)

The timetable lists registration and a pre-opening on Sunday; the Opening with a performance of the University Chorus, the Prandtl Lecture by Keith Moffatt, the Minisymposia & Young Researchers' Minisymposia, and the opening reception at the Castle of Charles V on Monday; plenary lectures during the week by Moritz Diehl (von Mises prize lecture), Thomas Böhlke, Ferdinando Auricchio, Giovanni Galdi, Enrique Zuazua, Nikolaus Adams, Daniel Kressner, and Stanislaw Stupkiewicz; the General Assembly; contributed sessions running in up to 14 parallel tracks; two poster sessions; the public lecture by Francesco D'Andria; the conference dinner at Hotel Tiziano; daily lunch breaks; and the Closing on Friday.

GAMM 2015, Università del Salento

Table of contents

YRMS4: Co-/Sparsity, Inverse Problems and Compressive Imaging
Recovering overcomplete sparse representations from structured sensing (Needell, Krahmer, Ward)
Computational Aspects of Sparse Recovery (Tillmann)
Joint reconstruction and segmentation from sparse Radon data (Frikel, Storath, Weinmann, Unser)
Empirical phase transitions in sparsity-regularized X-ray CT (Jørgensen)
Cosparse models and recovery algorithms for inverse problems in acoustics and electro-encephalography (Bertin, Kitić, Albera, Gribonval)

YRMS4: Co-/Sparsity, Inverse Problems and Compressive Imaging

At the core of many inverse and imaging problems are signal models which, in addition to stabilizing the solutions of these problems, are also capable of dealing with undersampling. A very popular model in this context is the sparse synthesis model: the assumption is that the signals can be expressed as a linear combination of a few signal components (atoms) from a dictionary. Over the last decade, a large body of theoretical and practical work in compressive sensing has built on this model. Despite this success, practical applications often do not satisfy the theoretical conditions that compressive sensing relies on, so the empirical success is not yet fully grounded mathematically. The goal of this mini-symposium is to draw attention to this gap between theory and practice, and to give young researchers working in inverse problems, imaging and compressed sensing the opportunity to present their new results.
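To make the sparse synthesis model concrete, the following minimal Python sketch (purely illustrative; the dictionary, dimensions and sparsity level are arbitrary choices, not taken from any of the talks) builds a signal from a few atoms of a random dictionary and observes it through an undersampling measurement matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k, m = 64, 128, 5, 32          # signal dim, dictionary size, sparsity, #measurements

# Sparse synthesis model: x = D s with only k nonzero coefficients in s.
D = rng.standard_normal((n, d)) / np.sqrt(n)   # overcomplete dictionary (d > n)
s = np.zeros(d)
support = rng.choice(d, size=k, replace=False)
s[support] = rng.standard_normal(k)
x = D @ s                                      # the signal, sparse with respect to D

# Undersampled linear measurements y = A x (m < n), the typical compressive sensing setting.
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x

print(f"signal dim n={n}, measurements m={m}, nonzeros in s: {np.count_nonzero(s)}")
```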

YRMS4, Monday, March 23, 16:30-17:10 (Giotto Room), speaker: Needell

Recovering overcomplete sparse representations from structured sensing

Felix Krahmer 1, Deanna Needell 2, Rachel Ward 3
1 Inst. Numerische und Angewandte Mathematik, Universität Göttingen
2 Claremont McKenna College
3 University of Texas at Austin

In many signal processing applications, one wishes to acquire images that are approximately sparse in transform domains such as wavelets from frequency-domain samples. Often the quality of the sparsity-based model improves significantly when one considers redundant representation systems such as wavelet frames. To date, however, compressed sensing with redundant representation systems has only been studied for measurement systems with certain concentration properties, which frequency-domain samples do not possess. In this talk, we close this gap, providing more general reconstruction guarantees for signals that are sparse with respect to redundant systems.
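As a small illustration of the measurement setup described above (not of the reconstruction guarantee itself), the sketch below builds a piecewise-constant signal, checks that its coefficients under a simple redundant analysis system (undecimated finite differences at several scales, standing in for a wavelet frame) are mostly zero, and takes randomly selected frequency-domain samples; the signal, the choice of system and all sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 256, 64                      # signal length, number of frequency samples

# A piecewise-constant test signal: approximately sparse under Haar-type systems.
x = np.zeros(n)
x[40:110] = 1.0
x[150:200] = -0.7

# Redundant "analysis" system: undecimated finite differences at several scales
# (a crude stand-in for a wavelet frame; the coefficients are mostly zero).
scales = [1, 2, 4, 8]
coeffs = np.concatenate([x[s:] - x[:-s] for s in scales])
print("fraction of zero frame coefficients:", np.mean(np.abs(coeffs) < 1e-8))

# Structured sensing: randomly chosen rows of the unitary DFT (frequency-domain samples).
rows = rng.choice(n, size=m, replace=False)
F = np.fft.fft(np.eye(n), axis=0) / np.sqrt(n)
y = F[rows] @ x                      # m complex frequency-domain measurements
print("measurement vector shape:", y.shape)
```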

YRMS4, Monday, March 23, 17:10-17:30 (Giotto Room), speaker: Tillmann

Computational Aspects of Sparse Recovery

Andreas M. Tillmann
Technische Universität Braunschweig

The fundamental problem of sparse recovery from underdetermined linear measurements reads
$\min \|x\|_0$ such that $\|Ax - b\| \le \delta$,   (1)
where $\delta \ge 0$ is an estimate of the measurement noise, $\|x\|_0$ denotes the number of nonzeros in $x$ (i.e., the support size) and $\|\cdot\|$ is some norm (often the $\ell_2$-norm). This problem is well known to be NP-hard in the strong sense and also difficult to approximate, even in the noiseless case $\delta = 0$. In this talk, we discuss a selection of aspects from the sparse recovery context, with a focus on computational matters.

Replacing the discrete objective by the $\ell_1$-norm has become a standard approach to obtain sparse solutions, backed by empirical success as well as theoretical guarantees for when this strategy in fact solves (1). For the noiseless basis pursuit model
$\min \|x\|_1$ such that $Ax = b$,   (2)
in the presence of a sparse solution, a simple "heuristic optimality check" (HOC) procedure can improve both the speed and the solution quality of many different problem-specific algorithms [1]. We extend the HOC idea to several noise-aware $\ell_1$-minimization models (for both the synthesis and the analysis approach) and provide preliminary numerical results that demonstrate the potential effectiveness of HOC schemes in this context.

It is well known that basis pursuit (2) can be written as a linear program. We show that the converse is also true (i.e., every LP can be transformed into a basis pursuit problem via a polynomial reduction) and sketch possible applications and complexity-theoretic implications.

In general, verifying that a relaxation or heuristic such as basis pursuit indeed provides the sparsest possible solution is often as hard as solving (1) itself, since evaluating the strongest known conditions that guarantee recovery success for heuristics is also NP-hard [2, 3]. On the one hand, this raises the question whether there are other polynomially solvable special cases of (1) besides those identified by the known recovery conditions. On the other hand, it motivates tackling (1) directly by exact methods from discrete and/or combinatorial optimization. In fact, there seem to be only very few attempts to solve (1) directly. Continuing the work from [4], we investigate branch-and-cut methods for binary set-covering integer programming reformulations of the sparse recovery task. In this context, to exploit the observation that (measurement) equations with few nonzero coefficients may provide stronger information about the optimal support, we encounter another interesting combinatorial problem, matrix sparsification, which can also be seen as a special dictionary learning problem [5]. We briefly discuss work in progress on the branch-and-cut approach and the connection to matrix sparsification.

References
[1] D. A. Lorenz, M. E. Pfetsch, A. M. Tillmann. Solving basis pursuit: Subgradient algorithm, heuristic optimality check, and solver comparison. ACM Trans. Math. Software, to appear (2014).
[2] A. M. Tillmann. Computational aspects of compressed sensing. Doctoral dissertation, TU Darmstadt.
[3] A. M. Tillmann, M. E. Pfetsch. The computational complexity of the restricted isometry property, the nullspace property, and related concepts in compressed sensing. IEEE Trans. Inf. Theory 60(2) (2014).
[4] S. Jokar, M. E. Pfetsch. Exact and approximate sparse solutions of underdetermined linear equations. SIAM J. Sci. Comput. 31(1) (2008).
[5] A. M. Tillmann. On the computational intractability of exact and approximate dictionary learning. IEEE Signal Process. Lett. 22(1) (2015).
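The abstract notes that basis pursuit (2) can be written as a linear program. The sketch below shows the standard LP reformulation (splitting x into positive and negative parts) using scipy.optimize.linprog; it is a generic illustration, not the HOC procedure or the branch-and-cut method of the talk, and the demo sizes are arbitrary:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit_lp(A, b):
    """Solve min ||x||_1 s.t. Ax = b via the standard LP reformulation
    with the split x = xp - xm, xp >= 0, xm >= 0."""
    m, n = A.shape
    c = np.ones(2 * n)                         # objective: sum(xp) + sum(xm)
    A_eq = np.hstack([A, -A])                  # A xp - A xm = b
    res = linprog(c, A_eq=A_eq, b_eq=b,
                  bounds=[(0, None)] * (2 * n), method="highs")
    return res.x[:n] - res.x[n:]

# Tiny demo: a 3-sparse vector observed through 20 random measurements
# should typically be recovered exactly by the l1 relaxation.
rng = np.random.default_rng(0)
n, m, k = 50, 20, 3
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
x_hat = basis_pursuit_lp(A, A @ x_true)
print("recovery error:", np.linalg.norm(x_hat - x_true))
```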

YRMS4, Monday, March 23, 17:30-17:50 (Giotto Room), speaker: Frikel

Joint reconstruction and segmentation from sparse Radon data

Martin Storath 1, Andreas Weinmann 2, Jürgen Frikel 2, and Michael Unser 1
1 Biomedical Imaging Group, EPFL Lausanne, Switzerland
2 Helmholtz Center Munich, Germany

In many tomographic imaging setups, reconstruction and segmentation are classically performed in two separate steps. The performance of the segmentation step then depends strongly on the quality of the reconstruction and hence often yields unsatisfactory results; in particular, if only a few noisy projections are available, such strategies perform especially poorly. To overcome this, combined (one-step) reconstruction and segmentation strategies have been shown to yield promising results. In this talk, we present an approach for joint reconstruction and segmentation using the Potts model, cf. [1]. More specifically, we propose a new algorithmic approach to the non-smooth and non-convex Potts problem (also called the piecewise-constant Mumford-Shah problem) for inverse imaging problems. We derive a suitable splitting into specific subproblems that can all be solved efficiently. Our method requires no a priori knowledge of the gray levels or of the number of segments of the reconstruction. Further, it avoids anisotropic artifacts such as geometric staircasing. We demonstrate the suitability of our method for joint image reconstruction and segmentation. We focus on Radon data, where we consider in particular sparse-angle situations; for instance, our method is able to recover all segments of the Shepp-Logan phantom from only 7 angular views. As a further application, we also consider reconstructions from spherical Radon data, which arise in photoacoustic tomography.

References
[1] M. Storath, A. Weinmann, J. Frikel, and M. Unser. Joint image reconstruction and segmentation using the Potts model. To appear in Inverse Problems (preprint on arXiv).
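For readers unfamiliar with the Potts functional, the sketch below implements the classical O(n²) dynamic program for the direct 1D Potts (piecewise-constant Mumford-Shah) denoising problem. It is only meant to make the model concrete; it is not the authors' splitting algorithm for the inverse (Radon) setting, and the penalty parameter and test signal are arbitrary:

```python
import numpy as np

def potts_1d(f, gamma):
    """Classical O(n^2) dynamic program for the 1D Potts problem
        min_u  gamma * (#jumps of u) + sum_i (u_i - f_i)^2,
    returning the piecewise-constant minimizer u."""
    n = len(f)
    s1 = np.concatenate([[0.0], np.cumsum(f)])        # prefix sums of f
    s2 = np.concatenate([[0.0], np.cumsum(f ** 2)])   # prefix sums of f^2

    def dev(l, r):
        # squared deviation of f[l..r] (inclusive, 0-based) from its mean
        m = r - l + 1
        return s2[r + 1] - s2[l] - (s1[r + 1] - s1[l]) ** 2 / m

    B = np.full(n + 1, np.inf)        # B[r] = optimal value for f[0..r-1]
    B[0] = -gamma                     # first segment pays no jump penalty
    start = np.zeros(n + 1, dtype=int)
    for r in range(1, n + 1):
        for l in range(1, r + 1):
            val = B[l - 1] + gamma + dev(l - 1, r - 1)
            if val < B[r]:
                B[r] = val
                start[r] = l - 1      # last segment starts at index l-1
    # Backtrack the segment boundaries and fill each segment with its mean.
    u = np.empty(n)
    r = n
    while r > 0:
        l = start[r]
        u[l:r] = f[l:r].mean()
        r = l
    return u

# Tiny demo: denoise a noisy three-level step signal.
rng = np.random.default_rng(0)
f = np.concatenate([np.zeros(50), np.ones(50), 0.3 * np.ones(50)])
f = f + 0.1 * rng.standard_normal(150)
u = potts_1d(f, gamma=1.0)
print("estimated segment values:", np.unique(np.round(u, 2)))
```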

YRMS4, Monday, March 23, 17:50-18:10 (Giotto Room), speaker: Jørgensen

Empirical phase transitions in sparsity-regularized X-ray CT

Jakob S. Jørgensen
Technical University of Denmark, Denmark

Sparsity regularization in X-ray computed tomography (CT) has shown large potential for accurate reconstruction from reduced data, see e.g. [1, 2], leading to substantially reduced patient X-ray exposure in medical imaging and shorter scan times in materials science. One driving factor for research in sparsity-regularized methods has been developments in compressed sensing (CS), connecting the possible undersampling level to the image sparsity level. CS offers guarantees of accurate reconstruction of sparse images from reduced data under suitable assumptions on the measurement matrix. However, existing CS guarantees, e.g. based on the restricted isometry property (RIP), do not apply to the structured and sparse measurement matrices of X-ray CT, while other, coherence-based guarantees lead to essentially useless bounds. The empirical success of sparsity regularization for X-ray CT therefore remains unexplained from a theoretical viewpoint.

Using empirical phase diagrams, we study image recoverability from X-ray CT data as a function of image sparsity and undersampling level. We consider sparsity in the image and gradient domains and reconstruction by $\ell_1$-norm and TV regularization. We demonstrate empirically that X-ray CT exhibits a pronounced relation between image sparsity and the possible undersampling level, as well as sharp phase transitions for certain image classes. We demonstrate that the recovery performance in X-ray CT is almost comparable to that of the standard CS Gaussian matrix ensemble, which suggests a similarity with CS. However, we also highlight some important differences, for example that the structure of the non-zero locations of the image, and not just the sparsity level, affects recoverability in X-ray CT.

Furthermore, we investigate the phase-transition behavior theoretically for a simple class of images consisting of sparse superpositions of radial basis functions (RBFs). RBFs are an often-used alternative to pixels and voxels for image representation in X-ray CT and are commonly referred to as blobs. The use of RBFs in X-ray CT image reconstruction is motivated by a reduction of model errors arising from the anisotropy of rectangular pixels and voxels. The rotational symmetry of RBFs implies a regularity that can be exploited to analyze the setup from a CS perspective. The analysis exploits non-negativity of the sampling matrix and the signal and utilizes new mathematical tools from CS theory in the form of expander graphs. This approach has recently been used to derive average-case recovery guarantees for certain restricted discrete tomography setups [6]. Here, we address how to generalize the approach to the more complicated scanning setups used in standard (non-discrete) tomography, for example parallel- and fan-beam geometries. We compare the theoretical recovery guarantees with empirical phase diagrams.

References
[1] E.Y. Sidky, X. Pan. Image reconstruction in circular cone-beam computed tomography by constrained, total-variation minimization. Phys. Med. Biol. 53 (2008).
[2] J. Bian, J.H. Siewerdsen, X. Han, E.Y. Sidky, J.L. Prince, C.A. Pelizzari, X. Pan. Evaluation of sparse-view reconstruction from flat-panel-detector cone-beam CT. Phys. Med. Biol. 55 (2010).
[3] J.S. Jørgensen, E.Y. Sidky. How little data is enough? Phase-diagram analysis of sparsity-regularized X-ray CT. Submitted (2014).
[4] J.S. Jørgensen, C. Kruschel, D.A. Lorenz. Testable uniqueness conditions for empirical assessment of undersampling levels in total variation-regularized X-ray CT. Inverse Probl. Sci. Eng. (2014), 23 pp.
[5] J.S. Jørgensen, E.Y. Sidky, P.C. Hansen, X. Pan. Empirical average-case relation between undersampling and sparsity in X-ray CT. Submitted (2014).
[6] S. Petra, C. Schnörr. Average case recovery analysis of tomographic compressed sensing. Linear Algebra Appl. 441 (2014).
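The sketch below computes a toy empirical phase diagram in the spirit described above, but with Gaussian measurement matrices and $\ell_1$ (basis pursuit) recovery only; the actual study uses X-ray CT system matrices and also TV regularization, which are not reproduced here, and the grid sizes, trial counts and tolerance are arbitrary choices:

```python
import numpy as np
from scipy.optimize import linprog

def bp(A, b):
    """Basis pursuit min ||x||_1 s.t. Ax = b via its LP reformulation."""
    m, n = A.shape
    res = linprog(np.ones(2 * n), A_eq=np.hstack([A, -A]), b_eq=b,
                  bounds=[(0, None)] * (2 * n), method="highs")
    return res.x[:n] - res.x[n:]

def empirical_phase_diagram(n=40, deltas=(0.3, 0.5, 0.7, 0.9),
                            rhos=(0.1, 0.2, 0.3, 0.4, 0.5),
                            trials=5, tol=1e-4):
    """Fraction of successful l1 recoveries over a grid of
    undersampling delta = m/n and relative sparsity rho = k/m."""
    rng = np.random.default_rng(0)
    table = np.zeros((len(rhos), len(deltas)))
    for j, delta in enumerate(deltas):
        m = max(1, int(delta * n))
        for i, rho in enumerate(rhos):
            k = max(1, int(rho * m))
            successes = 0
            for _ in range(trials):
                A = rng.standard_normal((m, n))
                x = np.zeros(n)
                x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
                x_hat = bp(A, A @ x)
                successes += np.linalg.norm(x_hat - x) <= tol * max(1.0, np.linalg.norm(x))
            table[i, j] = successes / trials
    return table

# Rows: increasing relative sparsity rho; columns: increasing undersampling delta.
print(empirical_phase_diagram())
```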

YRMS4, Monday, March 23, 18:10-18:30 (Giotto Room), speaker: Bertin

Cosparse models and recovery algorithms for inverse problems in acoustics and electro-encephalography

Srđan Kitić 1, Laurent Albera 1,2,3, Nancy Bertin 4,1, Rémi Gribonval 1
1 Inria, Centre Inria Rennes - Bretagne Atlantique
2 Inserm, UMR LTSI, Université Rennes 1
4 IRISA, CNRS UMR 6074

Sparse data models are powerful tools for solving ill-posed inverse problems. In their most general formulation, linear inverse problems can be expressed as the problem of recovering a signal $x \in \mathbb{R}^n$ from measurements $y \in \mathbb{R}^m$:
$y \approx Mx$,   (1)
where $M \in \mathbb{R}^{m \times n}$ is a transfer matrix modeling the signal acquisition process. In most cases this problem is ill-posed, and solving it requires additional knowledge in the problem formulation in order to regularize it. Sparse data models have emerged as a pervasive tool to perform such regularization. The now ubiquitous sparse synthesis model states that the signal $x \in \mathbb{R}^n$ is constructed as a linear combination of a few column vectors (atoms) taken from a dictionary $D \in \mathbb{R}^{n \times d}$, such that
$x = Ds$,   (2)
where the coefficient vector $s$ contains very few non-zero elements and is called a sparse representation of $x$. A counterpart of sparse synthesis is the sparse analysis, or cosparse, data model, which has gained traction more recently [1]. This model assumes that there exists an analysis operator $A \in \mathbb{R}^{b \times n}$ ($b \ge n$) such that the analysis representation
$z = Ax$   (3)
of the signal $x$ is sparse. A signal $x$ whose analysis representation $z = Ax \in \mathbb{R}^b$ contains $\ell$ zero elements is said to be $\ell$-cosparse. The analysis and the synthesis sparse models are nominally equivalent in only one special case: when $A = D^{-1}$, i.e., when the analysis operator and the dictionary are square, non-singular matrices [2].

In this presentation, we show two settings where the knowledge at hand is encoded through known partial differential equations (PDEs) governing the underlying physical phenomena, which can be used to design the dictionary $D$ (in the sparse synthesis case) or the analysis operator $A$ (in the cosparse analysis model). We present the resulting regularization framework based on sparse analysis models for two problems governed by linear PDEs: acoustic pressure source localization [4], governed by the wave equation, and brain electrical current source localization [3], which can be described by the Poisson equation. The cosparse model can very naturally incorporate this physical knowledge in an analysis operator $A$ resulting from a direct discretization of the underlying PDE. When solved by a specially tailored optimization algorithm based on the Alternating Direction Method of Multipliers, the underlying inverse problems can be handled efficiently and with much better scalability than their synthesis-based counterpart formulations.

References
[1] S. Nam, M.E. Davies, M. Elad, R. Gribonval. The cosparse analysis model and algorithms. Applied and Computational Harmonic Analysis 34 (2013).
[2] M. Elad, P. Milanfar, R. Rubinstein. Analysis versus synthesis in signal priors. Inverse Problems 23 (2007).
[3] L. Albera, S. Kitić, N. Bertin, G. Puy, R. Gribonval. Brain source localization using a physics-driven structured cosparse representation of EEG signals. MLSP (2014).
[4] S. Kitić, N. Bertin, R. Gribonval. Hearing behind walls: localizing sources in the room next door with cosparsity. ICASSP (2014).
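As a generic illustration of analysis (cosparse) regularization solved with ADMM (not the authors' tailored solver; a 1D second-difference operator stands in for a discretized PDE operator, and all parameters and sizes are arbitrary assumptions), consider the following sketch:

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def analysis_l1_admm(M, y, Omega, lam=0.1, rho=1.0, n_iter=300):
    """ADMM for the analysis / cosparse regularization
        min_x 0.5 * ||M x - y||_2^2 + lam * ||Omega x||_1,
    using the splitting z = Omega x."""
    z = np.zeros(Omega.shape[0])
    u = np.zeros(Omega.shape[0])
    # The x-update solves a linear system whose matrix is fixed across iterations;
    # for simplicity we call np.linalg.solve each time instead of pre-factorizing.
    Q = M.T @ M + rho * Omega.T @ Omega
    My = M.T @ y
    for _ in range(n_iter):
        x = np.linalg.solve(Q, My + rho * Omega.T @ (z - u))
        z = soft_threshold(Omega @ x + u, lam / rho)
        u = u + Omega @ x - z
    return x

# Toy example: Omega is a 1D second-difference operator (a crude, PDE-like
# analysis operator); the piecewise-linear signal is highly cosparse under it.
rng = np.random.default_rng(0)
n, m = 100, 40
t = np.linspace(0, 1, n)
x_true = np.where(t < 0.5, t, 1.0 - t)            # piecewise linear "hat"
Omega = np.diff(np.eye(n), 2, axis=0)             # rows [1, -2, 1]
M = rng.standard_normal((m, n)) / np.sqrt(m)
y = M @ x_true + 0.01 * rng.standard_normal(m)
x_hat = analysis_l1_admm(M, y, Omega, lam=0.02)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```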
