Sparse recovery for spherical harmonic expansions
1 Rachel Ward, Courant Institute, New York University. Workshop on Sparsity and Cosmology, Nice, May 31, 2011
2 Cosmic Microwave Background (CMB) radiation map. Temperature is measured as
$$T(\theta, \varphi) = \sum_{l=0}^{\infty} \sum_{k=-l}^{l} a_{(l,k)} Y_l^k(\theta, \varphi),$$
where the $Y_l^k$ are spherical harmonics. Red band: measurements are corrupted by the galactic signal.
3 The CMB map is compressible in spherical harmonics. Consider the coefficient vector $a = (a_{(l,k)})$ and
$$T(\theta, \varphi) \approx \sum_{l=0}^{n} \sum_{k=-l}^{l} a_{(l,k)} Y_l^k(\theta, \varphi).$$
This vector is predicted and observed to be compressible.
4 Spherical harmonics: Fourier analysis on the sphere. The $Y_l^k$ are products of complex exponentials and orthogonal Jacobi polynomials. The $Y_l^k$ are orthonormal with respect to the spherical surface measure $\sin(\varphi)\,d\varphi\,d\theta$.
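This orthonormality is easy to verify numerically. A minimal sketch (our illustration, not from the talk), assuming scipy is available; note scipy's convention `sph_harm(k, l, theta, phi)`, where theta is the azimuthal angle and phi the polar angle:

```python
# Sketch: numerically check that spherical harmonics are orthonormal
# w.r.t. sin(phi) dphi dtheta, using a tensor trapezoidal rule.
import numpy as np
from scipy.special import sph_harm  # sph_harm(k, l, theta, phi): theta = azimuth

theta = np.linspace(0.0, 2.0 * np.pi, 400)   # azimuthal angle
phi = np.linspace(0.0, np.pi, 400)           # polar angle
TH, PH = np.meshgrid(theta, phi, indexing="ij")

def inner(l1, k1, l2, k2):
    """Approximate the inner product <Y_{l1}^{k1}, Y_{l2}^{k2}> on the sphere."""
    f = (sph_harm(k1, l1, TH, PH) * np.conj(sph_harm(k2, l2, TH, PH))
         * np.sin(PH))
    return np.trapz(np.trapz(f, phi, axis=1), theta)

print(inner(3, 2, 3, 2))  # approximately 1 (normalization)
print(inner(3, 2, 4, 2))  # approximately 0 (orthogonality)
```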
5 CMB map inpainting via $\ell_1$-minimization (Abrial, Moudden, Starck, Fadili, Delabrouille, Nguyen 08): propose full-sky CMB map inpainting from partial measurements $T(\theta_j, \varphi_j)$, $j = 1, \dots, m$. Obtain coefficients $a = (a_{(l,k)})$ by solving the $\ell_1$-minimization problem
$$\hat a = \arg\min_c \|c\|_1 \quad \text{s.t.} \quad \sum_{l=0}^{D} \sum_{k=-l}^{l} c_{(l,k)} Y_l^k(\theta_j, \varphi_j) = T(\theta_j, \varphi_j), \quad j = 1, \dots, m,$$
where $D = \sqrt{N}$ is a prescribed maximal degree. Theoretical justification?
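The constrained problem above is standard basis pursuit. A minimal sketch of the solver step, assuming the cvxpy package is available (the helper name `basis_pursuit` is ours, not from the talk):

```python
# Minimal basis-pursuit sketch: min ||c||_1 subject to Phi c = T (assumes cvxpy).
import numpy as np
import cvxpy as cp

def basis_pursuit(Phi, T):
    """Recover coefficients c from samples T = Phi @ c via l1-minimization."""
    N = Phi.shape[1]
    c = cp.Variable(N, complex=np.iscomplexobj(Phi))
    problem = cp.Problem(cp.Minimize(cp.norm1(c)), [Phi @ c == T])
    problem.solve()
    return c.value
```

For the inpainting problem, `Phi` would hold the values $Y_l^k(\theta_j, \varphi_j)$ at the uncorrupted sample points and `T` the measured temperatures.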
6 The spherical sampling matrix. In matrix form, the constraints in the $\ell_1$-minimization problem are $\Phi c = T$, where $\Phi \in \mathbb{C}^{m \times N}$ is the spherical sampling matrix
$$\Phi = \begin{pmatrix} 1 & Y_1^{-1}(\theta_1, \varphi_1) & \cdots & Y_l^k(\theta_1, \varphi_1) & \cdots \\ 1 & Y_1^{-1}(\theta_2, \varphi_2) & \cdots & Y_l^k(\theta_2, \varphi_2) & \cdots \\ \vdots & \vdots & & \vdots & \\ 1 & Y_1^{-1}(\theta_m, \varphi_m) & \cdots & Y_l^k(\theta_m, \varphi_m) & \cdots \end{pmatrix}$$
We assume these measurements are underdetermined: $m < N$.
7 The spherical sampling matrix (cont.) Compressed sensing: if $\Phi$ acts as an approximate isometry on sparse vectors, then compressible vectors are stably recovered via $\ell_1$-minimization.
8 Restricted Isometry Property (RIP). Definition [Candès, Romberg, Tao 06]: the restricted isometry constant $\delta_s$ of a matrix $\Phi \in \mathbb{C}^{m \times N}$ is the smallest number such that for all $s$-sparse $x \in \mathbb{C}^N$,
$$(1 - \delta_s)\|x\|_2^2 \le \|\Phi x\|_2^2 \le (1 + \delta_s)\|x\|_2^2.$$
9 Restricted Isometry Property (cont.) It remains open to construct deterministic matrices satisfying the RIP in the regime $m \asymp s \log^p(N)$. If $\Phi \in \mathbb{R}^{m \times N}$ has i.i.d. Gaussian or Bernoulli entries and $m \ge C\delta^{-2} s \log(N/s)$, then $\delta_s \le \delta$ with high probability [CRT 06, RV 08, R 09]. If $m = O(s \log^4(N))$, the RIP holds w.h.p. for $\Phi$ associated to bounded orthonormal systems.
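Computing $\delta_s$ exactly requires examining every support set, which is exponential in $s$; for toy sizes it can be done by brute force. A sketch (our illustration):

```python
# Sketch: brute-force restricted isometry constant of Phi for sparsity s.
# Cost is binomial(N, s) SVDs, so this is only usable for toy dimensions.
import itertools
import numpy as np

def rip_constant(Phi, s):
    delta = 0.0
    for support in itertools.combinations(range(Phi.shape[1]), s):
        sv = np.linalg.svd(Phi[:, list(support)], compute_uv=False)
        delta = max(delta, sv[0] ** 2 - 1.0, 1.0 - sv[-1] ** 2)
    return delta

rng = np.random.default_rng(0)
m, N = 30, 12
Phi = rng.standard_normal((m, N)) / np.sqrt(m)  # normalized i.i.d. Gaussian
print(rip_constant(Phi, s=2))                   # small delta_2 w.h.p.
```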
10 RIP matrices are good for sparse recovery [CRT 06, C 08, Foucart 10]. If for $\Phi \in \mathbb{C}^{m \times N}$ we have $\delta_{2s} \le \delta_0$ ($\delta_0 = 0.46$ is valid), $y = \Phi x$ is observed, and $\hat x = \arg\min_z \|z\|_1$ subject to $\Phi z = y$, then
$$\|x - \hat x\|_2 \le C\,\frac{\|x - x_s\|_1}{\sqrt{s}},$$
where $x_s$ is the best $s$-term approximation to $x$. If $x$ is $s$-sparse, then $\hat x = x$ is recovered exactly. If $x$ is well-approximated by an $s$-sparse vector, then $\hat x \approx x$.
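An empirical illustration of the exact-recovery claim (our sketch, reusing the hypothetical `basis_pursuit` helper from the earlier sketch):

```python
# Sketch: an s-sparse vector is recovered exactly (up to solver tolerance)
# from m < N Gaussian measurements via basis pursuit.
import numpy as np

rng = np.random.default_rng(1)
m, N, s = 40, 128, 5
Phi = rng.standard_normal((m, N)) / np.sqrt(m)
x = np.zeros(N)
x[rng.choice(N, size=s, replace=False)] = rng.standard_normal(s)

x_hat = basis_pursuit(Phi, Phi @ x)   # helper defined in the earlier sketch
print(np.max(np.abs(x_hat - x)))      # tiny: exact recovery in practice
```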
11 Sparse recovery for bounded orthonormal systems
$$\Psi = \begin{pmatrix} \psi_1(x_1) & \psi_2(x_1) & \cdots & \psi_N(x_1) \\ \vdots & & & \vdots \\ \psi_1(x_m) & \psi_2(x_m) & \cdots & \psi_N(x_m) \end{pmatrix}$$
Suppose $(\psi_j)_{j=1}^N$ on a compact domain $D$ are orthonormal with respect to a measure $\nu$. Suppose $x_1, \dots, x_m \in D$ are chosen i.i.d. from $\nu$. Suppose $\max_{j=1,\dots,N} \|\psi_j\|_\infty \le K$.
Theorem (Rudelson, Vershynin 08; Rauhut 09). If $m \ge C K^2 \delta^{-2} s \log^3(s) \log(N)$, then the matrix $\frac{1}{\sqrt{m}}\Psi$ satisfies $\delta_s \le \delta$ with probability at least $1 - N^{-\gamma \log^3(s)}$.
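As a concrete instance, the normalized sampling matrix $\frac{1}{\sqrt{m}}\Psi$ for the Fourier system can be formed in a few lines (our sketch):

```python
# Sketch: normalized sampling matrix (1/sqrt(m)) Psi for the Fourier system
# psi_j(x) = exp(2 pi i j x), with i.i.d. sample points from dnu = dx on [0,1].
import numpy as np

def fourier_sampling_matrix(m, N, rng):
    x = rng.uniform(0.0, 1.0, size=m)   # x_1, ..., x_m ~ dnu = dx
    return np.exp(2j * np.pi * np.outer(x, np.arange(N))) / np.sqrt(m)

Psi = fourier_sampling_matrix(m=64, N=256, rng=np.random.default_rng(0))
print(Psi.shape)  # (64, 256); here K = 1 since |psi_j(x)| = 1 everywhere
```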
12 Examples of bounded orthonormal systems. Fourier, $\psi_j(x) = e^{2\pi i j x}$: $D = [0,1]$, $d\nu = dx$, $K = 1$ (there is also a discrete analog). Chebyshev polynomials $T_j(x)$: $D = [-1,1]$, $d\nu = \pi^{-1}(1-x^2)^{-1/2}\,dx$, $K = \sqrt{2}$. The RIP for $\Psi$ means that functions which admit $s$-sparse expansions with respect to the $\psi_j$ can be recovered from their values at $m$ sample points provided $m \ge CK^2 s \log^3(s) \log(N)$, and functions with compressible expansions can be recovered approximately.
13 Examples of bounded orthonormal systems. [Rauhut, W 10]: the preconditioned Legendre system $Q(x)L_j(x)$, where the $L_j$ are normalized Legendre polynomials, $Q(x) = C(1-x^2)^{1/4}$, $d\nu(x) = \pi^{-1}(1-x^2)^{-1/2}\,dx$, and $K = \sqrt{3}$. $Q(x)$ is a preconditioner; boundedness of the preconditioned system implies sparse recovery in the Legendre system.
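The uniform bound can be checked numerically (our sketch, not from the talk): with the normalization $L_j = \sqrt{j + 1/2}\,P_j$, orthonormal w.r.t. Lebesgue measure on $[-1,1]$, the matching preconditioner constant is $C = \sqrt{\pi}$, and $\sup_x |Q(x)L_j(x)| \le \sqrt{3}$:

```python
# Sketch: verify numerically that the preconditioned Legendre system is
# uniformly bounded, |Q(x) L_j(x)| <= sqrt(3), where L_j = sqrt(j + 1/2) P_j
# and Q(x) = sqrt(pi) (1 - x^2)^(1/4).
import numpy as np
from numpy.polynomial.legendre import Legendre

x = np.linspace(-1.0, 1.0, 20001)
Q = np.sqrt(np.pi) * (1.0 - x**2) ** 0.25
for j in range(60):
    L_j = np.sqrt(j + 0.5) * Legendre.basis(j)(x)  # orthonormal Legendre
    assert np.max(np.abs(Q * L_j)) <= np.sqrt(3.0)
print("bound sqrt(3) holds for all degrees j < 60")
```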
14 Examples of bounded orthonormal systems. [Rauhut, W 10]: more generally, the preconditioned Jacobi system $Q_\alpha(x)p_j^\alpha(x)$, where the $p_j^\alpha$ are polynomials orthonormal w.r.t. $d\nu(x) = (1-x^2)^\alpha\,dx$. [Krasikov 07]: $\|Q_\alpha p_j^\alpha\|_\infty \le C\alpha^{1/4}$ for $Q_\alpha(x) = (1-x^2)^{\alpha/2 + 1/4}$. Sampling from $d\nu(x) = \pi^{-1}(1-x^2)^{-1/2}\,dx$ then gives $K = C\alpha^{1/4}$. That is, Chebyshev sampling is universal for recovering sparse polynomial expansions.
15 The spherical harmonics. The spherical harmonics can be written as
$$Y_l^k(\theta, \varphi) = e^{ik\theta}\, p_{l-|k|}^{|k|}(\cos\varphi)\,(\sin\varphi)^{|k|}, \quad |k| \le l,\ l \ge 0,\ (\theta, \varphi) \in [0, 2\pi) \times [0, \pi],$$
where the $p_j^{|k|}$ are orthonormal Jacobi polynomials. Growth rates for complex exponentials and Jacobi polynomials give
$$\sup_{0 \le l \le N^{1/2}-1,\ |k| \le l} \big\|\sin(\varphi)^{1/2}\, Y_l^k(\theta, \varphi)\big\|_\infty \le C N^{1/8}.$$
This suggests the strategy of uniform sampling from the product measure $d\varphi\,d\theta$.
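The growth bound can be observed numerically. Our sketch, assuming scipy (whose `sph_harm(k, l, theta, phi)` takes the azimuthal angle first, and is deprecated in very recent versions in favor of `sph_harm_y`):

```python
# Sketch: track sup |sin(phi)^{1/2} Y_l^k| over degrees l < D and compare
# with the N^{1/8} rate, where N = D**2 is the number of coefficients.
import numpy as np
from scipy.special import sph_harm

theta = np.linspace(0.0, 2.0 * np.pi, 160)   # azimuth
phi = np.linspace(0.0, np.pi, 160)           # polar angle
TH, PH = np.meshgrid(theta, phi, indexing="ij")

D = 16
worst = 0.0
for l in range(D):
    for k in range(-l, l + 1):
        vals = np.sin(PH) ** 0.5 * np.abs(sph_harm(k, l, TH, PH))
        worst = max(worst, vals.max())
print(worst, (D ** 2) ** 0.125)  # sup grows like a constant times N^(1/8)
```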
16 Location of sampling points matters. Figure: phase transitions for sparse recovery on the sphere (two panels; vertical axis $s/m$, horizontal axis $m/N$). We form random $s$-sparse coefficient vectors $c = (c_{l,k})$ of degree $D = N^{1/2} = 16$ and choose $m$ sampling points from (a) the product measure $d\varphi\,d\theta$ and (b) the uniform spherical measure $\sin(\varphi)\,d\varphi\,d\theta$. Black indicates recovery.
17 Sparse recovery in spherical harmonic systems. Theorem (Rauhut, W 11). Suppose that $(\theta_1, \varphi_1), \dots, (\theta_m, \varphi_m)$, with $m \ge C s \log^3(s) N^{1/4} \log(N)$, are drawn independently from the uniform measure on $B = [0, 2\pi) \times [0, \pi]$. Let $\Phi$ be the $m \times N$ spherical sampling matrix and let $Q\Phi$ be its preconditioned version. With high probability the following holds for all spherical harmonic polynomials $g(\theta, \varphi) = \sum_{l=0}^{N^{1/2}-1} \sum_{k=-l}^{l} c_{l,k} Y_l^k(\theta, \varphi)$. Suppose that noisy sample values $y_j = g(\theta_j, \varphi_j) + \eta_j$ are observed, with $\|\eta\|_\infty \le \varepsilon$. Let $\hat c = \arg\min \|z\|_1$ subject to $\|Q\Phi z - Qy\|_2 \le \sqrt{m}\,\varepsilon$. Then
$$\|c - \hat c\|_2 \le C_1 \frac{\sigma_s(c)_1}{\sqrt{s}} + C_2 \varepsilon,$$
where $\sigma_s(c)_1$ is the error of best $s$-term approximation in $\ell_1$.
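A toy end-to-end prototype of the theorem's recovery procedure (our sketch, assuming cvxpy and scipy; the dimensions are illustrative only):

```python
# Sketch: preconditioned l1 recovery of a sparse spherical harmonic expansion
# from noisy samples drawn uniformly from [0, 2pi) x [0, pi].
import numpy as np
import cvxpy as cp
from scipy.special import sph_harm

rng = np.random.default_rng(2)
D, s, m, eps = 10, 4, 80, 1e-3
pairs = [(l, k) for l in range(D) for k in range(-l, l + 1)]
N = len(pairs)                                  # N = D**2 coefficients

theta = rng.uniform(0.0, 2.0 * np.pi, m)        # azimuth, uniform
phi = rng.uniform(0.0, np.pi, m)                # polar angle, uniform (NOT sin-weighted)
Phi = np.column_stack([sph_harm(k, l, theta, phi) for l, k in pairs])
Q = np.sin(phi) ** 0.5                          # preconditioner sin(phi)^{1/2}

c = np.zeros(N, dtype=complex)
c[rng.choice(N, size=s, replace=False)] = rng.standard_normal(s)
y = Phi @ c + rng.uniform(-eps, eps, size=m)    # noise with ||eta||_inf <= eps

z = cp.Variable(N, complex=True)
residual = cp.multiply(Q, Phi @ z - y)          # rows scaled by Q
problem = cp.Problem(cp.Minimize(cp.norm1(z)),
                     [cp.norm(residual, 2) <= np.sqrt(m) * eps])
problem.solve()
print(np.linalg.norm(z.value - c))              # small: stable recovery
```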
18 Conclusions. Our results provide a measure of theoretical justification for the good numerical results obtained for CMB map inpainting via $\ell_1$-minimization. Our results may also be of interest for other problems in geophysics, astronomy, and medical imaging.
19 Open problems. For practical implementation, we would rather sample from a discrete grid. In experiments, the sparse recovery results for discrete and continuous sampling are indistinguishable. Proof?
20 Open problems. In our proof, we require $m \gtrsim s N^{1/4} \log^4(N)$ sampling points (i.e., rows in $\Phi$) for $\ell_1$-minimization to recover $s$-sparse spherical polynomials of degree $N^{1/2}$. We should be able to improve this to $m \gtrsim s \log^p(N)$.
21 Open problems (cont.) In practice, different models of sparsity are better suited to the sphere, such as rotationally invariant sparsity sets, or sparsity in certain linear combinations of spherical harmonic coefficients.