Going off the grid. Benjamin Recht, University of California, Berkeley. Joint work with Badri Bhaskar, Parikshit Shah, and Gongguo Tang.
1 Going off the grid. Benjamin Recht, University of California, Berkeley. Joint work with Badri Bhaskar, Parikshit Shah, and Gongguo Tang.
2 Applications: imaging, astronomy, seismology, spectroscopy, DOA estimation, sampling, GPS, radar, ultrasound. Signal model: $x(t) = \sum_{j=1}^{k} c_j e^{i 2\pi f_j t}$
3 Same applications, pulse-train model: $x(t) = \sum_{j=1}^{k} c_j \, g(t - t_j)$
4 Modulated pulse-train model: $x(t) = \sum_{j=1}^{k} c_j \, g(t - t_j) \, e^{i 2\pi f_j t}$
5 Imaging model: $x(s) = \mathrm{PSF} \ast \left\{ \sum_{j=1}^{k} c_j \, \delta(s - s_j) \right\}$
6 In the Fourier domain: $\hat{x}(\xi) = \widehat{\mathrm{PSF}}(\xi) \sum_{j=1}^{k} c_j \, e^{-i 2\pi \xi s_j}$
7 Observe a sparse combination of sinusoids: $x_m = \sum_{k=1}^{s} c_k e^{i 2\pi m u_k}$ for some $u_k \in [0, 1)$. Spectrum estimation: find a combination of sinusoids agreeing with time-series data. Classic (Prony, 1795...): Prony's method. Assume the coefficients are positive for simplicity:
$\sum_{k=1}^{r} c_k \, a(u_k) a(u_k)^*, \quad a(u) = \begin{bmatrix} 1 \\ e^{i 2\pi u} \\ e^{i 4\pi u} \\ e^{i 6\pi u} \end{bmatrix}, \quad \text{giving} \quad \begin{bmatrix} x_0 & x_1 & x_2 & x_3 \\ \bar{x}_1 & x_0 & x_1 & x_2 \\ \bar{x}_2 & \bar{x}_1 & x_0 & x_1 \\ \bar{x}_3 & \bar{x}_2 & \bar{x}_1 & x_0 \end{bmatrix} =: \mathrm{toep}(x)$
$\mathrm{toep}(x)$ is positive semidefinite, and any null vector corresponds to a polynomial that vanishes at $e^{i 2\pi u_k}$. MUSIC, ESPRIT, Cadzow, etc.
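The moment-matrix structure above is easy to check numerically. A minimal numpy sketch (the frequencies, amplitudes, and matrix size are illustrative choices, not values from the talk): build $\mathrm{toep}(x)$ as a sum of rank-one atom outer products, then verify it is positive semidefinite with rank equal to the number of sinusoids, and that a null vector yields a polynomial vanishing at the frequencies.

```python
import numpy as np

# Illustrative frequencies u_k in [0, 1) and positive amplitudes c_k
u = np.array([0.1, 0.35, 0.7])
c = np.array([2.0, 1.0, 0.5])
n = 8  # moments x_0, ..., x_{n-1}

# Atoms a(u_k) = [1, e^{i 2 pi u_k}, ..., e^{i 2 pi (n-1) u_k}] as columns
A = np.exp(2j * np.pi * np.outer(np.arange(n), u))
# toep(x) = sum_k c_k a(u_k) a(u_k)^* : Hermitian Toeplitz, built from the moments
T = (A * c) @ A.conj().T

# Positive semidefinite, with rank equal to the number of sinusoids
assert np.linalg.eigvalsh(T).min() > -1e-9
assert np.linalg.matrix_rank(T, tol=1e-8) == len(u)

# The last right-singular vector spans the null space; its entries are the
# coefficients of a polynomial sum_j v_j z^j vanishing at z = e^{i 2 pi u_k}
v = np.linalg.svd(T)[2][-1]
roots = np.exp(2j * np.pi * u)
assert np.max(np.abs(np.polyval(v[::-1], roots))) < 1e-6
```

This is exactly the structure MUSIC-type methods exploit: the null space of the moment matrix encodes the frequencies.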
8 Observe a sparse combination of sinusoids: $x_m = \sum_{k=1}^{s} c_k e^{i 2\pi m u_k}$ for some $u_k \in [0, 1)$. Spectrum estimation: find a sparse combination of sinusoids agreeing with time-series data. Contemporary: $x \approx F c$, where $F \in \mathbb{C}^{n \times N}$ with $F_{ab} = \exp(i 2\pi a b / N)$. Solve with the LASSO: minimize $\tfrac{1}{2}\|x - F c\|^2 + \mu \|c\|_1$.
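The gridded LASSO above can be sketched with a plain ISTA loop. A hedged numpy illustration: the grid size, the two on-grid frequencies, the number of iterations, and $\mu$ are arbitrary demo choices, not values from the talk.

```python
import numpy as np

n, N = 32, 64                          # n time samples, N grid frequencies
b_idx, a_idx = np.meshgrid(np.arange(n), np.arange(N), indexing="ij")
F = np.exp(2j * np.pi * a_idx * b_idx / N)   # F_ab = exp(i 2 pi a b / N), n x N

# Signal: two on-grid sinusoids (grid indices chosen arbitrarily)
c_true = np.zeros(N, dtype=complex)
c_true[10], c_true[40] = 1.0, 0.6
x = F @ c_true

# ISTA for  minimize 0.5 * ||x - F c||^2 + mu * ||c||_1  (complex soft-thresholding)
mu = 1.0
L = np.linalg.norm(F, 2) ** 2          # Lipschitz constant of the smooth part
c = np.zeros(N, dtype=complex)
for _ in range(500):
    z = c - F.conj().T @ (F @ c - x) / L
    c = z / np.maximum(np.abs(z), 1e-12) * np.maximum(np.abs(z) - mu / L, 0.0)

# The two largest coefficients sit on the true grid frequencies
top2 = set(np.argsort(np.abs(c))[-2:])
```

The soft-thresholding step shrinks magnitudes while preserving phases, which is the complex-valued analogue of the usual LASSO proximal update.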
9 Observe a sparse combination of sinusoids: $x_m = \sum_{k=1}^{s} c_k e^{i 2\pi m u_k}$ for some $u_k \in [0, 1)$. Spectrum estimation: find a combination of sinusoids agreeing with time-series data.
Classic: grid-free, but needs to know the model order, lacks a quantitative theory, and is unstable in practice.
Contemporary (gridding + $\ell_1$ minimization): robust model selection and a quantitative theory, but suffers from discretization error, basis mismatch, and numerical instability.
Can we bridge the gap?
10 Parsimonious models: a model is a weighted combination of atoms. Search for the best linear combination of the fewest atoms; rank = the fewest atoms needed to describe the model.
11 Atomic norms. Given a basic set of atoms $\mathcal{A}$, define the gauge function $\|x\|_{\mathcal{A}} = \inf\{t > 0 : x \in t \, \mathrm{conv}(\mathcal{A})\}$. When $\mathcal{A}$ is centrosymmetric, we get a norm: $\|x\|_{\mathcal{A}} = \inf\{\sum_{a \in \mathcal{A}} c_a : x = \sum_{a \in \mathcal{A}} c_a a, \ c_a \ge 0\}$. IDEA: minimize $\|z\|_{\mathcal{A}}$ subject to $\Phi z = y$. When does this work?
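To make the definition concrete: with atoms $\mathcal{A} = \{\pm e_i\}$ (the signed standard basis vectors), the atomic norm is exactly the $\ell_1$ norm, and the decomposition form of the definition is a small linear program. A hedged sketch using scipy's `linprog` (the test vector is arbitrary):

```python
import numpy as np
from scipy.optimize import linprog

x = np.array([1.5, -2.0, 0.0, 0.25])
d = x.size

# Atoms {+e_i} and {-e_i} are the columns of A_eq. The atomic norm is
#   inf { sum_a c_a : x = sum_a c_a * a, c_a >= 0 },
# which for this finite atom set is a linear program in the weights c_a.
A_eq = np.hstack([np.eye(d), -np.eye(d)])
res = linprog(c=np.ones(2 * d), A_eq=A_eq, b_eq=x, bounds=(0, None))

assert res.success
# conv({+/- e_i}) is the cross-polytope, so ||x||_A = ||x||_1 here
assert abs(res.fun - np.abs(x).sum()) < 1e-8
```

The same gauge definition with other atom sets recovers the nuclear norm (rank-one matrices) and, for the sinusoid atoms of the later slides, the line-spectrum atomic norm.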
12 Atomic norm minimization. IDEA: minimize $\|z\|_{\mathcal{A}}$ subject to $\Phi z = y$. Generalizes existing, powerful methods; a rigorous recipe for developing new analysis algorithms; precise, tight bounds on the number of measurements needed for model recovery; one algorithm prototype for a myriad of data-analysis applications. [Chandrasekaran, Recht, Parrilo, and Willsky]
13 Spectrum estimation. Observe a sparse combination of sinusoids $x^\star_m = \sum_{k=1}^{s} c^\star_k e^{i 2\pi m u^\star_k}$ for some $u^\star_k \in [0, 1)$. Observe $y = x^\star + \omega$ (signal plus noise). Atomic set: $\mathcal{A} = \left\{ e^{i\phi} \left[1, e^{i 2\pi u}, e^{i 4\pi u}, \ldots, e^{i 2\pi (n-1) u}\right]^T : \phi \in [0, 2\pi), \ u \in [0, 1) \right\}$. Classical techniques (Prony, matrix pencil, MUSIC, ESPRIT, Cadzow) use the fact that noiseless moment matrices are low-rank:
$\sum_{k=1}^{r} \alpha_k \begin{bmatrix} 1 \\ e^{i\theta_k} \\ e^{2i\theta_k} \\ e^{3i\theta_k} \end{bmatrix} \begin{bmatrix} 1 \\ e^{i\theta_k} \\ e^{2i\theta_k} \\ e^{3i\theta_k} \end{bmatrix}^* = \begin{bmatrix} \mu_0 & \mu_1 & \mu_2 & \mu_3 \\ \bar\mu_1 & \mu_0 & \mu_1 & \mu_2 \\ \bar\mu_2 & \bar\mu_1 & \mu_0 & \mu_1 \\ \bar\mu_3 & \bar\mu_2 & \bar\mu_1 & \mu_0 \end{bmatrix}$
14 Atomic norm for spectrum estimation. IDEA: minimize $\|z\|_{\mathcal{A}}$ subject to $\|z - y\| \le \delta$. Atomic set: $\mathcal{A} = \left\{ e^{i\phi} \left[1, e^{i 2\pi u}, e^{i 4\pi u}, \ldots, e^{i 2\pi (n-1) u}\right]^T : \phi \in [0, 2\pi), \ u \in [0, 1) \right\}$. How do we solve the optimization problem? Can we approximate the true signal from partial and noisy measurements? Can we estimate the frequencies from partial and noisy measurements?
15 Which atomic norm for sinusoids? $x_m = \sum_{k=1}^{s} c_k e^{i 2\pi m u_k}$ for some $u_k \in [0, 1)$. When the $c_k$ are positive, $\sum_{k=1}^{r} c_k \, a(u_k) a(u_k)^* = \mathrm{toep}(x)$, the positive semidefinite Toeplitz moment matrix of slide 7. For general coefficients, the convex hull is characterized by linear matrix inequalities (Toeplitz positive semidefinite): $\|x\|_{\mathcal{A}} = \inf\left\{ \tfrac{1}{2} t + \tfrac{1}{2} w_1 : \begin{bmatrix} t & x^* \\ x & \mathrm{toep}(w) \end{bmatrix} \succeq 0 \right\}$, where $w_1$ is the first entry of $w$. (Compare the moment curve $[t, t^2, t^3, t^4, \ldots]$, $t \in S$.)
16 [Figure: the atomic set $\mathcal{A}$, its symmetrization $\mathcal{A} \cup \{-\mathcal{A}\}$, and their convex hulls $\mathrm{conv}(\mathcal{A})$ and $\mathrm{conv}(\mathcal{A} \cup \{-\mathcal{A}\})$.]
17 [Figure: plots of $\cos(2\pi u_1 k)/2 + \cos(2\pi u_2 k)/2$ and $\cos(2\pi u_1 k)/2 - \cos(2\pi u_2 k)/2$ for two nearby frequencies, $u_1 = .43$ and $u_2 = .4$.]
18 Nearly optimal rates. Observe a sparse combination of sinusoids $x^\star_m = \sum_{k=1}^{s} c^\star_k e^{i 2\pi m u^\star_k}$ for some $u^\star_k \in [0, 1)$. Observe $y = x^\star + \omega$ (signal plus noise). Assume the frequencies are far apart: $\min_{p \ne q} d(u_p, u_q) \ge 4/n$. Solve $\hat{x} = \arg\min_x \tfrac{1}{2}\|x - y\|_2^2 + \mu \|x\|_{\mathcal{A}}$. Error rate: $\tfrac{1}{n}\|\hat{x} - x^\star\|_2^2 \le C \sigma^2 \tfrac{s \log(n)}{n}$. No algorithm can do better than $\mathbb{E}\left[\tfrac{1}{n}\|\hat{x} - x^\star\|_2^2\right] \ge C \sigma^2 \tfrac{s \log(n/s)}{n}$ even if the frequencies are well separated, and no algorithm can do better than $\tfrac{1}{n}\|\hat{x} - x^\star\|_2^2 \ge C \sigma^2 \tfrac{s}{n}$ even if we knew all of the frequencies $u_k^\star$.
19 Mean square error performance profiles. Frequencies generated at random with $1/n$ separation; random phases, fading amplitudes. Cadzow and MUSIC are provided the model order; AST (atomic norm soft thresholding) and the LASSO estimate the noise power. For MSE, lower is better. Performance profile over all parameter values and settings, for algorithm $s$: $P_s(\beta) = \dfrac{\#\{p \in \mathcal{P} : \mathrm{MSE}_s(p) \le \beta \min_{s'} \mathrm{MSE}_{s'}(p)\}}{\#(\mathcal{P})}$. Higher is better.
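The performance-profile formula translates directly to code. A small numpy sketch with a made-up MSE table (the numbers are illustrative only, not results from the talk):

```python
import numpy as np

def performance_profile(mse, betas):
    """mse: (num_algorithms, num_problems) array of per-problem MSEs.
    Returns P[s, i] = fraction of problems p with
    MSE_s(p) <= betas[i] * min over algorithms of MSE(p)."""
    best = mse.min(axis=0)                       # best MSE per problem
    return np.array([[np.mean(mse[s] <= b * best) for b in betas]
                     for s in range(mse.shape[0])])

# Toy table: 3 algorithms x 4 problem instances
mse = np.array([[1.0, 2.0, 1.0, 4.0],
                [2.0, 1.0, 1.0, 1.0],
                [4.0, 4.0, 2.0, 1.0]])
P = performance_profile(mse, betas=np.array([1.0, 2.0, 4.0]))
```

At $\beta = 1$, $P_s(1)$ is the fraction of problems on which algorithm $s$ is the best; as $\beta$ grows, every profile climbs to 1.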
20 Extracting the decomposition. How do you extract the frequencies? Look at the dual norm: $\|v\|_{\mathcal{A}}^* = \max_{a \in \mathcal{A}} \langle a, v \rangle = \max_{u \in [0,1)} \left| \sum_{k=1}^{n} v_k e^{-2\pi i k u} \right|$. The dual norm is the maximum modulus of a trigonometric polynomial. At optimality, the maximum modulus is attained at the support of the signal; this works far better than Prony interpolation in practice.
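Because the dual norm is the maximum modulus of a trigonometric polynomial, it can be approximated on a fine grid with a zero-padded FFT: `np.fft.fft(v, K)[j]` evaluates $\sum_k v_k e^{-2\pi i k j / K}$ exactly. A sketch (the oversampling factor and the test frequency are arbitrary choices):

```python
import numpy as np

def dual_atomic_norm(v, oversample=64):
    """Approximate max over u in [0,1) of |sum_k v_k e^{-2 pi i k u}|
    on a grid of K = oversample * len(v) frequencies."""
    K = oversample * len(v)
    vals = np.abs(np.fft.fft(v, K))     # polynomial modulus at u = j/K
    j = int(np.argmax(vals))
    return vals[j], j / K               # (max modulus, maximizing frequency u)

# Sanity check: for v equal to an atom at frequency u0, the polynomial's
# modulus peaks at u0 with value n (u0 chosen to land exactly on the fine grid).
n, u0 = 16, 0.3125
v = np.exp(2j * np.pi * u0 * np.arange(n))
m_val, u_hat = dual_atomic_norm(v)
```

In the AST pipeline this evaluation is applied to the dual optimal vector: the frequencies are read off as the locations where the modulus touches its maximum.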
21 Aside: the proof. How do you prove any compressed-sensing-esque result? Look at the dual norm: $\|v\|_{\mathcal{A}}^* = \max_{a \in \mathcal{A}} \langle a, v \rangle = \max_{u \in [0,1)} \left| \sum_{k=1}^{n} v_k e^{-2\pi i k u} \right|$. Generic proof: construct a polynomial such that the atoms you want maximize the polynomial, then prove that the polynomial is bounded everywhere else.
22 Localization guarantees. [Figure: dual polynomial magnitude versus frequency, marking estimated frequencies, true frequencies, and spurious frequencies.]
Spurious amplitudes: $\sum_{l : \hat f_l \in F} |\hat c_l| \le C_1 \dfrac{k \sigma^2 \log(n)}{n}$.
Weighted frequency deviation: $\sum_{f_j \in T} \sum_{l : \hat f_l \in N_j} |\hat c_l| \, d(f_j, \hat f_l)^2 \le C_2 \dfrac{k \sigma^2 \log(n)}{n}$.
Near-region approximation: $\sum_{f_j \in T} \left| c_j - \sum_{l : \hat f_l \in N_j} \hat c_l \right| \le C_3 \dfrac{k \sigma^2 \log(n)}{n}$.
(Here $T$ is the set of true frequencies, $N_j$ is the near region around the true frequency $f_j$, and $F$ is the far region away from every true frequency.)
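The three localization metrics can be computed directly from an estimated and a true spike list. A hedged numpy sketch: the near-region radius is left as a free parameter (the slide's cutoff constant did not survive transcription), and $d$ is wrap-around distance on $[0, 1)$; the example spike lists are invented for illustration.

```python
import numpy as np

def wrap_dist(u, v):
    """Distance on the torus [0, 1)."""
    d = np.abs(u - v) % 1.0
    return np.minimum(d, 1.0 - d)

def localization_metrics(f_true, c_true, f_hat, c_hat, radius):
    """Spurious-amplitude mass, weighted frequency deviation, and near-region
    approximation error, with N_j = {f : d(f, f_j) <= radius} and the far
    region F its complement. `radius` is a free parameter here."""
    f_true, c_true = np.asarray(f_true), np.asarray(c_true)
    f_hat, c_hat = np.asarray(f_hat), np.asarray(c_hat)
    D = wrap_dist(f_hat[:, None], f_true[None, :])   # pairwise distances
    near_any = (D <= radius).any(axis=1)
    spurious = np.abs(c_hat[~near_any]).sum()
    wfd, near_err = 0.0, 0.0
    for j in range(len(f_true)):
        in_Nj = D[:, j] <= radius
        wfd += np.sum(np.abs(c_hat[in_Nj]) * D[in_Nj, j] ** 2)
        near_err += np.abs(c_true[j] - c_hat[in_Nj].sum())
    return spurious, wfd, near_err

# One accurate spike, one exact hit, and one small spurious spike
spurious, wfd, near_err = localization_metrics(
    [0.2, 0.6], [1.0, 1.0], [0.201, 0.6, 0.9], [0.95, 1.0, 0.05], radius=0.01)
```

Small values of all three quantities are exactly what the guarantees above promise for the AST estimate.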
23 Frequency localization. [Figure: performance profiles $P_s(\beta)$ for AST, MUSIC, and Cadzow on the three localization metrics (spurious amplitudes, weighted frequency deviation, near-region approximation), at $k = 32$, across SNR (dB).]
24 Incomplete data / random sampling. Observe a random subset of samples $T \subset \{0, 1, \ldots, n-1\}$. On the grid: Candès, Romberg & Tao (2004). Off the grid: new; compressed sensing extended to the continuous domain. Recover the missing part by solving: minimize $\|z\|_{\mathcal{A}}$ subject to $z_T = x^\star_T$. Extract the frequencies from the dual optimal solution. [Figure: full observation versus random sampling.]
25 Exact recovery (off the grid). $x_m = \sum_{k=1}^{s} c_k \exp(2\pi i m u_k)$; WANT $\{u_k, c_k\}$, with minimum frequency separation $\Delta$.
Theorem (Candès and Fernandez-Granda, 2012): a line spectrum with minimum frequency separation $\Delta > 4/s$ can be recovered from the first $2s$ Fourier coefficients via atomic norm minimization.
Theorem (Tang, Bhaskar, Shah, and Recht, 2012): a line spectrum with minimum frequency separation $\Delta > 4/n$ can be recovered from most subsets of the first $n$ Fourier coefficients of size at least $O(s \log(s) \log(n))$.
So $s$ random samples are better than $s$ equispaced samples. On a grid, this is just compressed sensing; off the grid, this is new. See [BCGLS2] for a sketching approach to a similar problem. No balancing of coherence as $n$ grows.
26 [Figure: mean square error performance profiles $P(\beta)$ comparing the SDP with basis pursuit on discretized grids at 4x and 64x oversampling (SDP, BP:4x, BP:64x).]
27 Discretization. Discretize the parameter space to get a finite dictionary: $x(t) \stackrel{?}{=} \sum_{l=1}^{N} c_l e^{i 2\pi f_l t}$, $f_l \in$ grid. Gridded approaches: Wakin, Becker, et al. (2012); Fannjiang, Strohmer & Yan (2010); Bajwa, Haupt, Sayeed & Nowak (2010); Tropp, Laska, Duarte, Romberg & Baraniuk (2010); Herman & Strohmer (2009); Malioutov, Cetin & Willsky (2005); Candès, Romberg & Tao (2004). Basis mismatch: Chi, Scharf, Pezeshki & Calderbank (2011); Herman & Strohmer (2010).
28–29 Approximation through discretization. Relaxations: $\mathcal{A}_1 \subseteq \mathcal{A}_2 \implies \|x\|_{\mathcal{A}_2} \le \|x\|_{\mathcal{A}_1}$. Let $\tilde{T}$ be a finite net of the parameter space, and let $\Phi_{\tilde{T}}$ be a matrix whose columns are the atoms $\mathcal{A}_{\tilde{T}} = \{a(t) : t \in \tilde{T}\}$. Then $\|x\|_{\mathcal{A}_{\tilde{T}}} = \inf\left\{\sum_k c_k : x = \sum_k c_k \, a(t_k), \ t_k \in \tilde{T}\right\}$ (plus extra equality constraints). When polynomials of bounded degree are Lipschitz on $\tilde{T}$, there is a $\lambda \in (0, 1]$ such that $\lambda \|x\|_{\mathcal{A}_{\tilde{T}}} \le \|x\|_{\mathcal{A}} \le \|x\|_{\mathcal{A}_{\tilde{T}}}$.
30 Discretization for sinusoids. Discretized atomic set: $\mathcal{A}_N = \left\{ e^{i\phi} \left[1, e^{i 2\pi u}, e^{i 4\pi u}, \ldots, e^{i 2\pi (n-1) u}\right]^T : u = \tfrac{k}{N}, \ k = 0, \ldots, N-1, \ \phi \in [0, 2\pi) \right\}$. Then $\|x\|_{\mathcal{A}_N} = \inf\left\{ \sum_{k=0}^{N-1} |c_k| : x = F_{[N,n]} c \right\}$, where $F_{[N,n]}$ is the first $n$ rows of the $N \times N$ DFT matrix, and $\left(1 - \tfrac{2\pi n}{N}\right) \|x\|_{\mathcal{A}_N} \le \|x\|_{\mathcal{A}} \le \|x\|_{\mathcal{A}_N}$. The approximation is fast to compute using first-order methods and only improves as $N$ grows. It allows you to estimate the signal, but you can't recover the exact frequencies.
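The discretized set $\mathcal{A}_N$ is just the set of columns of a partial DFT matrix, which is easy to verify numerically. A small numpy check ($n$, $N$, and the grid indices are arbitrary; the sign convention $F_{ab} = \exp(i 2\pi a b / N)$ follows the earlier slide):

```python
import numpy as np

n, N = 8, 32
# First n rows of the N x N DFT matrix, F_ab = exp(i 2 pi a b / N)
F = np.exp(2j * np.pi * np.outer(np.arange(n), np.arange(N)) / N)

# Column k is the atom sampled at the grid frequency u = k/N:
# a(u) = [1, e^{i 2 pi u}, ..., e^{i 2 pi (n-1) u}]
k = 5
atom = np.exp(2j * np.pi * (k / N) * np.arange(n))
assert np.allclose(F[:, k], atom)

# Any x built from grid atoms is representable as F c,
# so the discrete atomic norm is at most the weight sum used to build it
c = np.zeros(N, dtype=complex)
c[3], c[11] = 1.0, 0.5j
x = F @ c
assert np.allclose(x, 1.0 * F[:, 3] + 0.5j * F[:, 11])
```

This is why the gridded problem is just $\ell_1$ minimization over DFT coefficients, solvable with standard first-order methods.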
31 Discretization. Discretize the parameter space to get a finite number of grid points. Enforce a finite number of constraints in the dual: $|\langle z, a(\theta_j) \rangle| \le 1$, $j = 1, \ldots, m$. Equivalently, in the primal, replace the atomic norm with a discrete one. What happens to the solutions as the grid is refined?
32 Convergence in the dual. Assumption: there exist parameters $\theta_1, \ldots, \theta_m$ such that $a(\theta_1), \ldots, a(\theta_m)$ are linearly independent. Enforce finite constraints in the dual: $|\langle z, a(\theta_j) \rangle| \le 1$, $j = 1, \ldots, m$.
Theorem: the discretized optimal objectives converge to the original objective, and any solution sequence $\{\hat{z}_m\}$ of the discretized problems has a subsequence that converges to the solution set of the original problem. For the LASSO dual, the convergence speed is $O(1/m)$.
[Figure: $\log_2$ plots of the objective gap $|f_m - f|$ and the solution gap $\|\hat{z}_m - \hat{z}\|$ versus $\log_2(m)$.]
33 Convergence in the primal. The original primal solution is associated with a measure $\hat{\mu}$: $\hat{x} = \int a(\theta) \, \hat{\mu}(d\theta)$, $\|\hat{x}\|_{\mathcal{A}} = \|\hat{\mu}\|_{TV}$. The discretized primal solutions $\hat{x}_m$ are associated with measures $\hat{\mu}_m$ supported only on the $m$ grid points.
Theorem: the discretized optimal measures converge in distribution to an original optimal measure. Fixing a neighborhood of the support of an original optimal measure, the supports of the discretized optimal measures eventually fall within that neighborhood.
[Figure: discretized optimal measures for increasing $m$ (starting at $m = 64$) concentrating around the support of the original optimal measure.]
34 Single Molecule Imaging Courtesy of Zhuang Research Lab
35 Single molecule imaging. Bundles of 8 tubes of 3 nm diameter; sparse density: 849 molecules on 2 frames. Resolution: 64x64 pixels. Pixel size: nm x nm. Field of view: 64 nm x 64 nm. Target resolution: nm x nm. Discretize the FOV into 64x64 pixels: $I(x, y) = \sum_j c_j \, \mathrm{PSF}(x - x_j, y - y_j)$, with $(x_j, y_j) \in [0, 64]^2$ and $(x, y) \in \{5, 15, \ldots, 635\}$.
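The discretized imaging model above can be sketched as follows. The isotropic Gaussian PSF, its width, and the molecule positions are assumptions made for illustration; the slide does not specify the microscope's PSF, and the pixel-center grid follows the slide's $\{5, 15, \ldots, 635\}$ discretization.

```python
import numpy as np

def render_frame(positions, amplitudes, sigma=15.0):
    """I(x, y) = sum_j c_j PSF(x - x_j, y - y_j), sampled at the pixel
    centers {5, 15, ..., 635}. The PSF is an isotropic Gaussian of width
    sigma -- an assumption, since the actual PSF is not given on the slide."""
    centers = np.arange(5.0, 640.0, 10.0)        # 64 pixel centers per axis
    X, Y = np.meshgrid(centers, centers)
    frame = np.zeros_like(X)
    for (xj, yj), cj in zip(positions, amplitudes):
        frame += cj * np.exp(-((X - xj) ** 2 + (Y - yj) ** 2) / (2.0 * sigma ** 2))
    return frame

# Two molecules at continuous (off-grid) positions
frame = render_frame([(100.0, 200.0), (400.0, 400.0)], [1.0, 0.5])
```

Localization then amounts to inverting this forward model: recover the continuous positions $(x_j, y_j)$ and weights $c_j$ from the pixelated frame, which is exactly the off-grid sparse recovery problem of the earlier slides in two dimensions.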
36 Single Molecule Imaging
37 Single molecule imaging. [Figure: precision, recall, Jaccard index, and F-score versus tolerance radius for the sparse (atomic) method, center-of-gravity (CoG), and quickPALM.]
38 Summary and future work. A unified approach for continuous problems in estimation that obviates incoherence and basis mismatch. New insight into bounds on estimation error and exact recovery through convex duality and algebraic geometry. State-of-the-art results in the classic fields of spectrum estimation and system identification. Future work: recovery of general sparse signal trains; more general moment problems in signal processing.
39 Acknowledgements. Work developed with Venkat Chandrasekaran, Babak Hassibi (Caltech), Weiyu Xu (Iowa), Pablo A. Parrilo, Alan Willsky (MIT), Maryam Fazel (Washington), Badri Bhaskar, Rob Nowak, Nikhil Rao, and Gongguo Tang (Wisconsin). For all references, see:
40 References:
Atomic norm denoising with applications to line spectral estimation. Badri Narayan Bhaskar, Gongguo Tang, and Benjamin Recht. IEEE Transactions on Signal Processing, vol. 61, no. 23, 2013.
Compressed sensing off the grid. Gongguo Tang, Badri Narayan Bhaskar, Parikshit Shah, and Benjamin Recht. IEEE Transactions on Information Theory, vol. 59, no. 11, 2013.
Near minimax line spectral estimation. Gongguo Tang, Badri Narayan Bhaskar, and Benjamin Recht. Submitted to IEEE Transactions on Information Theory, 2013.
The convex geometry of inverse problems. Venkat Chandrasekaran, Benjamin Recht, Pablo A. Parrilo, and Alan Willsky. Foundations of Computational Mathematics, vol. 12, no. 6, 2012.
All references can be found at
Going off the grid. Benjamin Recht Department of Computer Sciences University of Wisconsin-Madison
Going off the grid Benjamin Recht Department of Computer Sciences University of Wisconsin-Madison Joint work with Badri Bhaskar Parikshit Shah Gonnguo Tang We live in a continuous world... But we work
More informationCombining Sparsity with Physically-Meaningful Constraints in Sparse Parameter Estimation
UIUC CSL Mar. 24 Combining Sparsity with Physically-Meaningful Constraints in Sparse Parameter Estimation Yuejie Chi Department of ECE and BMI Ohio State University Joint work with Yuxin Chen (Stanford).
More informationSuper-resolution via Convex Programming
Super-resolution via Convex Programming Carlos Fernandez-Granda (Joint work with Emmanuel Candès) Structure and Randomness in System Identication and Learning, IPAM 1/17/2013 1/17/2013 1 / 44 Index 1 Motivation
More informationTowards a Mathematical Theory of Super-resolution
Towards a Mathematical Theory of Super-resolution Carlos Fernandez-Granda www.stanford.edu/~cfgranda/ Information Theory Forum, Information Systems Laboratory, Stanford 10/18/2013 Acknowledgements This
More informationROBUST BLIND SPIKES DECONVOLUTION. Yuejie Chi. Department of ECE and Department of BMI The Ohio State University, Columbus, Ohio 43210
ROBUST BLIND SPIKES DECONVOLUTION Yuejie Chi Department of ECE and Department of BMI The Ohio State University, Columbus, Ohio 4 ABSTRACT Blind spikes deconvolution, or blind super-resolution, deals with
More informationarxiv: v2 [cs.it] 16 Feb 2013
Atomic norm denoising with applications to line spectral estimation Badri Narayan Bhaskar, Gongguo Tang, and Benjamin Recht Department of Electrical and Computer Engineering Department of Computer Sciences
More informationSparse Parameter Estimation: Compressed Sensing meets Matrix Pencil
Sparse Parameter Estimation: Compressed Sensing meets Matrix Pencil Yuejie Chi Departments of ECE and BMI The Ohio State University Colorado School of Mines December 9, 24 Page Acknowledgement Joint work
More informationParameter Estimation for Mixture Models via Convex Optimization
Parameter Estimation for Mixture Models via Convex Optimization Yuanxin Li Department of Electrical and Computer Engineering The Ohio State University Columbus Ohio 432 Email: li.3822@osu.edu Yuejie Chi
More informationELE 538B: Sparsity, Structure and Inference. Super-Resolution. Yuxin Chen Princeton University, Spring 2017
ELE 538B: Sparsity, Structure and Inference Super-Resolution Yuxin Chen Princeton University, Spring 2017 Outline Classical methods for parameter estimation Polynomial method: Prony s method Subspace method:
More informationOptimization-based sparse recovery: Compressed sensing vs. super-resolution
Optimization-based sparse recovery: Compressed sensing vs. super-resolution Carlos Fernandez-Granda, Google Computational Photography and Intelligent Cameras, IPAM 2/5/2014 This work was supported by a
More informationFrom Compressed Sensing to Matrix Completion and Beyond. Benjamin Recht Department of Computer Sciences University of Wisconsin-Madison
From Compressed Sensing to Matrix Completion and Beyond Benjamin Recht Department of Computer Sciences University of Wisconsin-Madison Netflix Prize One million big ones! Given 100 million ratings on a
More informationCompressed Sensing Off the Grid
Compressed Sensing Off the Grid Gongguo Tang, Badri Narayan Bhaskar, Parikshit Shah, and Benjamin Recht Department of Electrical and Computer Engineering Department of Computer Sciences University of Wisconsin-adison
More informationStrengthened Sobolev inequalities for a random subspace of functions
Strengthened Sobolev inequalities for a random subspace of functions Rachel Ward University of Texas at Austin April 2013 2 Discrete Sobolev inequalities Proposition (Sobolev inequality for discrete images)
More informationFast 2-D Direction of Arrival Estimation using Two-Stage Gridless Compressive Sensing
Fast 2-D Direction of Arrival Estimation using Two-Stage Gridless Compressive Sensing Mert Kalfa ASELSAN Research Center ASELSAN Inc. Ankara, TR-06370, Turkey Email: mkalfa@aselsan.com.tr H. Emre Güven
More informationLARGE SCALE 2D SPECTRAL COMPRESSED SENSING IN CONTINUOUS DOMAIN
LARGE SCALE 2D SPECTRAL COMPRESSED SENSING IN CONTINUOUS DOMAIN Jian-Feng Cai, Weiyu Xu 2, and Yang Yang 3 Department of Mathematics, Hong Kong University of Science and Technology, Clear Water Bay, Kowloon,
More informationSupport Detection in Super-resolution
Support Detection in Super-resolution Carlos Fernandez-Granda (Stanford University) 7/2/2013 SampTA 2013 Support Detection in Super-resolution C. Fernandez-Granda 1 / 25 Acknowledgements This work was
More informationThe convex algebraic geometry of linear inverse problems
The convex algebraic geometry of linear inverse problems The MIT Faculty has made this article openly available. Please share how this access benefits you. Your story matters. Citation As Published Publisher
More informationarxiv:submit/ [cs.it] 25 Jul 2012
Compressed Sensing off the Grid Gongguo Tang, Badri Narayan Bhaskar, Parikshit Shah, and Benjamin Recht University of Wisconsin-Madison July 25, 202 arxiv:submit/05227 [cs.it] 25 Jul 202 Abstract We consider
More informationSPECTRAL COMPRESSIVE SENSING WITH POLAR INTERPOLATION. Karsten Fyhn, Hamid Dadkhahi, Marco F. Duarte
SPECTRAL COMPRESSIVE SENSING WITH POLAR INTERPOLATION Karsten Fyhn, Hamid Dadkhahi, Marco F. Duarte Dept. of Electronic Systems, Aalborg University, Denmark. Dept. of Electrical and Computer Engineering,
More informationApproximate Support Recovery of Atomic Line Spectral Estimation: A Tale of Resolution and Precision
Approximate Support Recovery of Atomic Line Spectral Estimation: A Tale of Resolution and Precision Qiuwei Li and Gongguo Tang Department of Electrical Engineering and Computer Science, Colorado School
More informationECE 8201: Low-dimensional Signal Models for High-dimensional Data Analysis
ECE 8201: Low-dimensional Signal Models for High-dimensional Data Analysis Lecture 7: Matrix completion Yuejie Chi The Ohio State University Page 1 Reference Guaranteed Minimum-Rank Solutions of Linear
More informationSparse DOA estimation with polynomial rooting
Sparse DOA estimation with polynomial rooting Angeliki Xenaki Department of Applied Mathematics and Computer Science Technical University of Denmark 28 Kgs. Lyngby, Denmark Email: anxe@dtu.dk Peter Gerstoft
More informationCompressed Sensing and Robust Recovery of Low Rank Matrices
Compressed Sensing and Robust Recovery of Low Rank Matrices M. Fazel, E. Candès, B. Recht, P. Parrilo Electrical Engineering, University of Washington Applied and Computational Mathematics Dept., Caltech
More informationThe convex algebraic geometry of rank minimization
The convex algebraic geometry of rank minimization Pablo A. Parrilo Laboratory for Information and Decision Systems Massachusetts Institute of Technology International Symposium on Mathematical Programming
More informationOff-the-Grid Line Spectrum Denoising and Estimation with Multiple Measurement Vectors
Off-the-Grid Line Spectrum Denoising and Estimation with Multiple Measurement Vectors Yuanxin Li and Yuejie Chi arxiv:48.4v [cs.it] Jul 5 Abstract Compressed Sensing suggests that the required number of
More informationCOMPRESSED SENSING (CS) is an emerging theory
LU et al.: DISTRIBUTED COMPRESSED SENSING OFF THE GRID Distributed Compressed Sensing off the Grid Zhenqi Lu*, ndong Ying, Sumxin Jiang, Peilin Liu, Member, IEEE, and Wenxian Yu, Member, IEEE arxiv:47.364v3
More informationMassive MIMO: Signal Structure, Efficient Processing, and Open Problems II
Massive MIMO: Signal Structure, Efficient Processing, and Open Problems II Mahdi Barzegar Communications and Information Theory Group (CommIT) Technische Universität Berlin Heisenberg Communications and
More informationInformation and Resolution
Information and Resolution (Invited Paper) Albert Fannjiang Department of Mathematics UC Davis, CA 95616-8633. fannjiang@math.ucdavis.edu. Abstract The issue of resolution with complex-field measurement
More informationAN ALGORITHM FOR EXACT SUPER-RESOLUTION AND PHASE RETRIEVAL
2014 IEEE International Conference on Acoustic, Speech and Signal Processing (ICASSP) AN ALGORITHM FOR EXACT SUPER-RESOLUTION AND PHASE RETRIEVAL Yuxin Chen Yonina C. Eldar Andrea J. Goldsmith Department
More informationGauge optimization and duality
1 / 54 Gauge optimization and duality Junfeng Yang Department of Mathematics Nanjing University Joint with Shiqian Ma, CUHK September, 2015 2 / 54 Outline Introduction Duality Lagrange duality Fenchel
More informationA Fast Algorithm for Reconstruction of Spectrally Sparse Signals in Super-Resolution
A Fast Algorithm for Reconstruction of Spectrally Sparse Signals in Super-Resolution Jian-Feng ai a, Suhui Liu a, and Weiyu Xu b a Department of Mathematics, University of Iowa, Iowa ity, IA 52242; b Department
More informationA Super-Resolution Algorithm for Multiband Signal Identification
A Super-Resolution Algorithm for Multiband Signal Identification Zhihui Zhu, Dehui Yang, Michael B. Wakin, and Gongguo Tang Department of Electrical Engineering, Colorado School of Mines {zzhu, dyang,
More informationECE G: Special Topics in Signal Processing: Sparsity, Structure, and Inference
ECE 18-898G: Special Topics in Signal Processing: Sparsity, Structure, and Inference Low-rank matrix recovery via convex relaxations Yuejie Chi Department of Electrical and Computer Engineering Spring
More informationA New Estimate of Restricted Isometry Constants for Sparse Solutions
A New Estimate of Restricted Isometry Constants for Sparse Solutions Ming-Jun Lai and Louis Y. Liu January 12, 211 Abstract We show that as long as the restricted isometry constant δ 2k < 1/2, there exist
More informationCompressed Sensing and Affine Rank Minimization Under Restricted Isometry
IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 61, NO. 13, JULY 1, 2013 3279 Compressed Sensing Affine Rank Minimization Under Restricted Isometry T. Tony Cai Anru Zhang Abstract This paper establishes new
More informationCompressive Sensing of Sparse Tensor
Shmuel Friedland Univ. Illinois at Chicago Matheon Workshop CSA2013 December 12, 2013 joint work with Q. Li and D. Schonfeld, UIC Abstract Conventional Compressive sensing (CS) theory relies on data representation
More informationSolving Underdetermined Linear Equations and Overdetermined Quadratic Equations (using Convex Programming)
Solving Underdetermined Linear Equations and Overdetermined Quadratic Equations (using Convex Programming) Justin Romberg Georgia Tech, ECE Caltech ROM-GR Workshop June 7, 2013 Pasadena, California Linear
More informationOn the recovery of measures without separation conditions
Norbert Wiener Center Department of Mathematics University of Maryland, College Park http://www.norbertwiener.umd.edu Applied and Computational Mathematics Seminar Georgia Institute of Technology October
More informationDemixing Sines and Spikes: Robust Spectral Super-resolution in the Presence of Outliers
Demixing Sines and Spikes: Robust Spectral Super-resolution in the Presence of Outliers Carlos Fernandez-Granda, Gongguo Tang, Xiaodong Wang and Le Zheng September 6 Abstract We consider the problem of
More informationLinear System Identification via Atomic Norm Regularization
Linear System Identification via Atomic Norm Regularization Parikshit Shah, Badri Narayan Bhaskar, Gongguo Tang and Benjamin Recht University of Wisconsin-Madison Abstract This paper proposes a new algorithm
More informationRapid, Robust, and Reliable Blind Deconvolution via Nonconvex Optimization
Rapid, Robust, and Reliable Blind Deconvolution via Nonconvex Optimization Shuyang Ling Department of Mathematics, UC Davis Oct.18th, 2016 Shuyang Ling (UC Davis) 16w5136, Oaxaca, Mexico Oct.18th, 2016
More informationRandom projections. 1 Introduction. 2 Dimensionality reduction. Lecture notes 5 February 29, 2016
Lecture notes 5 February 9, 016 1 Introduction Random projections Random projections are a useful tool in the analysis and processing of high-dimensional data. We will analyze two applications that use
More informationRobust Uncertainty Principles: Exact Signal Reconstruction from Highly Incomplete Frequency Information
1 Robust Uncertainty Principles: Exact Signal Reconstruction from Highly Incomplete Frequency Information Emmanuel Candès, California Institute of Technology International Conference on Computational Harmonic
More informationSuper-Resolution of Mutually Interfering Signals
Super-Resolution of utually Interfering Signals Yuanxin Li Department of Electrical and Computer Engineering The Ohio State University Columbus Ohio 430 Email: li.38@osu.edu Yuejie Chi Department of Electrical
More informationSparse and Low-Rank Matrix Decompositions
Forty-Seventh Annual Allerton Conference Allerton House, UIUC, Illinois, USA September 30 - October 2, 2009 Sparse and Low-Rank Matrix Decompositions Venkat Chandrasekaran, Sujay Sanghavi, Pablo A. Parrilo,
More informationSparse Recovery Beyond Compressed Sensing
Sparse Recovery Beyond Compressed Sensing Carlos Fernandez-Granda www.cims.nyu.edu/~cfgranda Applied Math Colloquium, MIT 4/30/2018 Acknowledgements Project funded by NSF award DMS-1616340 Separable Nonlinear
More informationLecture Notes 9: Constrained Optimization
Optimization-based data analysis Fall 017 Lecture Notes 9: Constrained Optimization 1 Compressed sensing 1.1 Underdetermined linear inverse problems Linear inverse problems model measurements of the form
More informationThree Generalizations of Compressed Sensing
Thomas Blumensath School of Mathematics The University of Southampton June, 2010 home prev next page Compressed Sensing and beyond y = Φx + e x R N or x C N x K is K-sparse and x x K 2 is small y R M or
More informationMinimizing the Difference of L 1 and L 2 Norms with Applications
1/36 Minimizing the Difference of L 1 and L 2 Norms with Department of Mathematical Sciences University of Texas Dallas May 31, 2017 Partially supported by NSF DMS 1522786 2/36 Outline 1 A nonconvex approach:
More informationThree Recent Examples of The Effectiveness of Convex Programming in the Information Sciences
Three Recent Examples of The Effectiveness of Convex Programming in the Information Sciences Emmanuel Candès International Symposium on Information Theory, Istanbul, July 2013 Three stories about a theme
More informationNoisy Signal Recovery via Iterative Reweighted L1-Minimization
Noisy Signal Recovery via Iterative Reweighted L1-Minimization Deanna Needell UC Davis / Stanford University Asilomar SSC, November 2009 Problem Background Setup 1 Suppose x is an unknown signal in R d.
More informationBlind Deconvolution Using Convex Programming. Jiaming Cao
Blind Deconvolution Using Convex Programming Jiaming Cao Problem Statement The basic problem Consider that the received signal is the circular convolution of two vectors w and x, both of length L. How
More informationSIGNALS with sparse representations can be recovered
IEEE SIGNAL PROCESSING LETTERS, VOL. 22, NO. 9, SEPTEMBER 2015 1497 Cramér Rao Bound for Sparse Signals Fitting the Low-Rank Model with Small Number of Parameters Mahdi Shaghaghi, Student Member, IEEE,
More informationConstrained optimization
Constrained optimization DS-GA 1013 / MATH-GA 2824 Optimization-based Data Analysis http://www.cims.nyu.edu/~cfgranda/pages/obda_fall17/index.html Carlos Fernandez-Granda Compressed sensing Convex constrained
More informationCompressed Sensing, Sparse Inversion, and Model Mismatch
Compressed Sensing, Sparse Inversion, and Model Mismatch Ali Pezeshki, Yuejie Chi, Louis L. Scharf, and Edwin K. P. Chong Abstract The advent of compressed sensing theory has revolutionized our view of
More informationSelf-Calibration and Biconvex Compressive Sensing
Self-Calibration and Biconvex Compressive Sensing Shuyang Ling Department of Mathematics, UC Davis July 12, 2017 Shuyang Ling (UC Davis) SIAM Annual Meeting, 2017, Pittsburgh July 12, 2017 1 / 22 Acknowledgements
More informationOptimisation Combinatoire et Convexe.
Optimisation Combinatoire et Convexe. Low complexity models, l 1 penalties. A. d Aspremont. M1 ENS. 1/36 Today Sparsity, low complexity models. l 1 -recovery results: three approaches. Extensions: matrix
More informationRobust Principal Component Analysis
ELE 538B: Mathematics of High-Dimensional Data Robust Principal Component Analysis Yuxin Chen Princeton University, Fall 2018 Disentangling sparse and low-rank matrices Suppose we are given a matrix M
More informationSparse Interactions: Identifying High-Dimensional Multilinear Systems via Compressed Sensing
Sparse Interactions: Identifying High-Dimensional Multilinear Systems via Compressed Sensing Bobak Nazer and Robert D. Nowak University of Wisconsin, Madison Allerton 10/01/10 Motivation: Virus-Host Interaction
More informationExact Joint Sparse Frequency Recovery via Optimization Methods
IEEE TRANSACTIONS ON SIGNAL PROCESSING Exact Joint Sparse Frequency Recovery via Optimization Methods Zai Yang, Member, IEEE, and Lihua Xie, Fellow, IEEE arxiv:45.6585v [cs.it] 3 May 6 Abstract Frequency
More informationFast Angular Synchronization for Phase Retrieval via Incomplete Information
Fast Angular Synchronization for Phase Retrieval via Incomplete Information Aditya Viswanathan a and Mark Iwen b a Department of Mathematics, Michigan State University; b Department of Mathematics & Department
More informationSpectral Compressed Sensing via Structured Matrix Completion
Yuxin Chen yxchen@stanford.edu Department of Electrical Engineering, Stanford University, Stanford, CA 94305, USA Yuejie Chi chi@ece.osu.edu Electrical and Computer Engineering, The Ohio State University,
More informationNew Coherence and RIP Analysis for Weak. Orthogonal Matching Pursuit
New Coherence and RIP Analysis for Wea 1 Orthogonal Matching Pursuit Mingrui Yang, Member, IEEE, and Fran de Hoog arxiv:1405.3354v1 [cs.it] 14 May 2014 Abstract In this paper we define a new coherence
More informationLow-Rank Matrix Recovery
ELE 538B: Mathematics of High-Dimensional Data Low-Rank Matrix Recovery Yuxin Chen Princeton University, Fall 2018 Outline Motivation Problem setup Nuclear norm minimization RIP and low-rank matrix recovery
More informationarxiv: v1 [math.oc] 11 Jun 2009
RANK-SPARSITY INCOHERENCE FOR MATRIX DECOMPOSITION VENKAT CHANDRASEKARAN, SUJAY SANGHAVI, PABLO A. PARRILO, S. WILLSKY AND ALAN arxiv:0906.2220v1 [math.oc] 11 Jun 2009 Abstract. Suppose we are given a
More informationComputable Performance Analysis of Sparsity Recovery with Applications
Computable Performance Analysis of Sparsity Recovery with Applications Arye Nehorai Preston M. Green Department of Electrical & Systems Engineering Washington University in St. Louis, USA European Signal
More informationMatrix completion: Fundamental limits and efficient algorithms. Sewoong Oh Stanford University
Matrix completion: Fundamental limits and efficient algorithms Sewoong Oh Stanford University 1 / 35 Low-rank matrix completion Low-rank Data Matrix Sparse Sampled Matrix Complete the matrix from small
More informationSignal Recovery, Uncertainty Relations, and Minkowski Dimension
Signal Recovery, Uncertainty Relations, and Minkowski Dimension Helmut Bőlcskei ETH Zurich December 2013 Joint work with C. Aubel, P. Kuppinger, G. Pope, E. Riegler, D. Stotz, and C. Studer Aim of this
More informationAn Extended Frank-Wolfe Method, with Application to Low-Rank Matrix Completion
An Extended Frank-Wolfe Method, with Application to Low-Rank Matrix Completion Robert M. Freund, MIT joint with Paul Grigas (UC Berkeley) and Rahul Mazumder (MIT) CDC, December 2016 1 Outline of Topics
More informationEfficient Spectral Methods for Learning Mixture Models!
Efficient Spectral Methods for Learning Mixture Models Qingqing Huang 2016 February Laboratory for Information & Decision Systems Based on joint works with Munther Dahleh, Rong Ge, Sham Kakade, Greg Valiant.
More informationCompressive Line Spectrum Estimation with Clustering and Interpolation
Compressive Line Spectrum Estimation with Clustering and Interpolation Dian Mo Department of Electrical and Computer Engineering University of Massachusetts Amherst, MA, 01003 mo@umass.edu Marco F. Duarte
More informationRobust Recovery of Positive Stream of Pulses
1 Robust Recovery of Positive Stream of Pulses Tamir Bendory arxiv:1503.0878v3 [cs.it] 9 Jan 017 Abstract The problem of estimating the delays and amplitudes of a positive stream of pulses appears in many
More informationarxiv: v1 [cs.it] 21 Feb 2013
q-ary Compressive Sensing arxiv:30.568v [cs.it] Feb 03 Youssef Mroueh,, Lorenzo Rosasco, CBCL, CSAIL, Massachusetts Institute of Technology LCSL, Istituto Italiano di Tecnologia and IIT@MIT lab, Istituto
More informationarxiv: v4 [cs.it] 21 Dec 2017
Atomic Norm Minimization for Modal Analysis from Random and Compressed Samples Shuang Li, Dehui Yang, Gongguo Tang, and Michael B. Wakin December 5, 7 arxiv:73.938v4 cs.it] Dec 7 Abstract Modal analysis
More informationGeneralized Orthogonal Matching Pursuit- A Review and Some
Generalized Orthogonal Matching Pursuit- A Review and Some New Results Department of Electronics and Electrical Communication Engineering Indian Institute of Technology, Kharagpur, INDIA Table of Contents
More informationCSC 576: Variants of Sparse Learning
CSC 576: Variants of Sparse Learning Ji Liu Department of Computer Science, University of Rochester October 27, 205 Introduction Our previous note basically suggests using l norm to enforce sparsity in
More informationCompressive Two-Dimensional Harmonic Retrieval via Atomic Norm Minimization
1030 IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL 63, NO 4, FEBRUARY 15, 2015 Compressive Two-Dimensional Harmonic Retrieval via Atomic Norm Minimization Yuejie Chi, Member, IEEE, Yuxin Chen, Student Member,
More informationAn iterative hard thresholding estimator for low rank matrix recovery
An iterative hard thresholding estimator for low rank matrix recovery Alexandra Carpentier - based on a joint work with Arlene K.Y. Kim Statistical Laboratory, Department of Pure Mathematics and Mathematical
More informationTractable Upper Bounds on the Restricted Isometry Constant
Tractable Upper Bounds on the Restricted Isometry Constant Alex d Aspremont, Francis Bach, Laurent El Ghaoui Princeton University, École Normale Supérieure, U.C. Berkeley. Support from NSF, DHS and Google.
More informationIEEE SIGNAL PROCESSING LETTERS, VOL. 22, NO. 9, SEPTEMBER
IEEE SIGNAL PROCESSING LETTERS, VOL. 22, NO. 9, SEPTEMBER 2015 1239 Preconditioning for Underdetermined Linear Systems with Sparse Solutions Evaggelia Tsiligianni, StudentMember,IEEE, Lisimachos P. Kondi,
More informationMultipath Matching Pursuit
Multipath Matching Pursuit Submitted to IEEE trans. on Information theory Authors: S. Kwon, J. Wang, and B. Shim Presenter: Hwanchol Jang Multipath is investigated rather than a single path for a greedy
More informationLecture: Introduction to Compressed Sensing Sparse Recovery Guarantees
Lecture: Introduction to Compressed Sensing Sparse Recovery Guarantees http://bicmr.pku.edu.cn/~wenzw/bigdata2018.html Acknowledgement: this slides is based on Prof. Emmanuel Candes and Prof. Wotao Yin
More informationGREEDY SIGNAL RECOVERY REVIEW
GREEDY SIGNAL RECOVERY REVIEW DEANNA NEEDELL, JOEL A. TROPP, ROMAN VERSHYNIN Abstract. The two major approaches to sparse recovery are L 1-minimization and greedy methods. Recently, Needell and Vershynin
More informationChapter 3 Compressed Sensing, Sparse Inversion, and Model Mismatch
Chapter 3 Compressed Sensing, Sparse Inversion, and Model Mismatch Ali Pezeshki, Yuejie Chi, Louis L. Scharf, and Edwin K.P. Chong Abstract The advent of compressed sensing theory has revolutionized our
More informationProbabilistic Low-Rank Matrix Completion with Adaptive Spectral Regularization Algorithms
Probabilistic Low-Rank Matrix Completion with Adaptive Spectral Regularization Algorithms François Caron Department of Statistics, Oxford STATLEARN 2014, Paris April 7, 2014 Joint work with Adrien Todeschini,
More informationUniqueness Conditions For Low-Rank Matrix Recovery
Claremont Colleges Scholarship @ Claremont CMC Faculty Publications and Research CMC Faculty Scholarship 3-28-2011 Uniqueness Conditions For Low-Rank Matrix Recovery Yonina C. Eldar Israel Institute of
More informationCompressive Sensing under Matrix Uncertainties: An Approximate Message Passing Approach
Compressive Sensing under Matrix Uncertainties: An Approximate Message Passing Approach Asilomar 2011 Jason T. Parker (AFRL/RYAP) Philip Schniter (OSU) Volkan Cevher (EPFL) Problem Statement Traditional
More informationRecovery of Simultaneously Structured Models using Convex Optimization
Recovery of Simultaneously Structured Models using Convex Optimization Maryam Fazel University of Washington Joint work with: Amin Jalali (UW), Samet Oymak and Babak Hassibi (Caltech) Yonina Eldar (Technion)
More informationAnalysis of Robust PCA via Local Incoherence
Analysis of Robust PCA via Local Incoherence Huishuai Zhang Department of EECS Syracuse University Syracuse, NY 3244 hzhan23@syr.edu Yi Zhou Department of EECS Syracuse University Syracuse, NY 3244 yzhou35@syr.edu
More informationRecovering any low-rank matrix, provably
Recovering any low-rank matrix, provably Rachel Ward University of Texas at Austin October, 2014 Joint work with Yudong Chen (U.C. Berkeley), Srinadh Bhojanapalli and Sujay Sanghavi (U.T. Austin) Matrix
More informationThe Analysis Cosparse Model for Signals and Images
The Analysis Cosparse Model for Signals and Images Raja Giryes Computer Science Department, Technion. The research leading to these results has received funding from the European Research Council under
More informationSimultaneous Sparsity
Simultaneous Sparsity Joel A. Tropp Anna C. Gilbert Martin J. Strauss {jtropp annacg martinjs}@umich.edu Department of Mathematics The University of Michigan 1 Simple Sparse Approximation Work in the d-dimensional,
More informationPhase recovery with PhaseCut and the wavelet transform case
Phase recovery with PhaseCut and the wavelet transform case Irène Waldspurger Joint work with Alexandre d Aspremont and Stéphane Mallat Introduction 2 / 35 Goal : Solve the non-linear inverse problem Reconstruct
More informationGaussian Phase Transitions and Conic Intrinsic Volumes: Steining the Steiner formula
Gaussian Phase Transitions and Conic Intrinsic Volumes: Steining the Steiner formula Larry Goldstein, University of Southern California Nourdin GIoVAnNi Peccati Luxembourg University University British
More informationSolving Corrupted Quadratic Equations, Provably
Solving Corrupted Quadratic Equations, Provably Yuejie Chi London Workshop on Sparse Signal Processing September 206 Acknowledgement Joint work with Yuanxin Li (OSU), Huishuai Zhuang (Syracuse) and Yingbin
More informationarxiv: v1 [cs.it] 4 Nov 2017
Separation-Free Super-Resolution from Compressed Measurements is Possible: an Orthonormal Atomic Norm Minimization Approach Weiyu Xu Jirong Yi Soura Dasgupta Jian-Feng Cai arxiv:17111396v1 csit] 4 Nov
More informationThe Convex Geometry of Linear Inverse Problems
Found Comput Math 2012) 12:805 849 DOI 10.1007/s10208-012-9135-7 The Convex Geometry of Linear Inverse Problems Venkat Chandrasekaran Benjamin Recht Pablo A. Parrilo Alan S. Willsky Received: 2 December
More informationExact Low Tubal Rank Tensor Recovery from Gaussian Measurements
Exact Low Tubal Rank Tensor Recovery from Gaussian Measurements Canyi Lu, Jiashi Feng 2, Zhouchen Li,4 Shuicheng Yan 5,2 Department of Electrical and Computer Engineering, Carnegie Mellon University 2
More informationSuper-Resolution of Point Sources via Convex Programming
Super-Resolution of Point Sources via Convex Programming Carlos Fernandez-Granda July 205; Revised December 205 Abstract We consider the problem of recovering a signal consisting of a superposition of
More information+1 (626) Dwight Way Berkeley, CA 94704
Samet Oymak sametoymak@gmail.com The Voleon Group +1 (626) 720-2114 2170 Dwight Way www.sametoymak.com Berkeley, CA 94704 ACADEMIC EXPERIENCE University of California, Berkeley (Sept. 2014 June 2015) Postdoctoral
More information