Going off the grid. Benjamin Recht Department of Computer Sciences University of Wisconsin-Madison
1 Going off the grid. Benjamin Recht, Department of Computer Sciences, University of Wisconsin-Madison. Joint work with Badri Bhaskar, Parikshit Shah, and Gongguo Tang.
2 We live in a continuous world... But we work with digital computers... What is the price of living on the grid?
3 Applications: imaging, astronomy, seismology, spectroscopy, DOA estimation, sampling, GPS, radar, ultrasound. Spectral model: $x(t) = \sum_{j=1}^{k} c_j e^{i 2\pi f_j t}$
4 Pulse-train model: $x(t) = \sum_{j=1}^{k} c_j\, g(t - t_j)$
5 Modulated pulse train: $x(t) = \sum_{j=1}^{k} c_j\, g(t - t_j)\, e^{i 2\pi f_j t}$
6 Imaging model: $x(s) = \mathrm{PSF} * \Big\{ \sum_{j=1}^{k} c_j\, \delta(s - s_j) \Big\}$
7 In the Fourier domain: $\hat{x}(\omega) = \widehat{\mathrm{PSF}}(\omega) \sum_{j=1}^{k} c_j e^{-i 2\pi \omega s_j}$
8 Observe a sparse combination of sinusoids: $x_m = \sum_{k=1}^{s} c_k e^{i 2\pi m u_k}$ for some $u_k \in [0,1)$. Spectrum estimation: find a combination of sinusoids agreeing with time-series data. Classic techniques require sensitive frequency localization or knowledge of the model order. Can we leverage convex geometry for new algorithms for, and analysis of, spectrum estimation? Sparse combinations are drawn from the atomic set $\mathcal{A} = \big\{ [\, e^{i\phi},\ e^{i 2\pi\theta + i\phi},\ e^{i 4\pi\theta + i\phi},\ \ldots,\ e^{i 2\pi n\theta + i\phi} \,]^T : \phi \in [0, 2\pi),\ \theta \in [0,1) \big\}$.
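To make the observation model concrete, here is a minimal Python sketch (not from the slides; the helper name and the coefficient and frequency values are illustrative) that generates the samples $x_m = \sum_{k=1}^{s} c_k e^{i2\pi m u_k}$ using only the standard library:

```python
import cmath

def sinusoid_samples(coeffs, freqs, n):
    """Samples x_m = sum_k c_k * exp(i*2*pi*m*u_k) for m = 0..n-1:
    a sparse combination of s complex sinusoids with frequencies in [0, 1)."""
    return [sum(c * cmath.exp(2j * cmath.pi * m * u)
                for c, u in zip(coeffs, freqs))
            for m in range(n)]

# Two sinusoids at frequencies u_1 = 0.1 and u_2 = 0.35 (hypothetical values)
x = sinusoid_samples(coeffs=[1.0, 0.5], freqs=[0.1, 0.35], n=8)
```

Spectrum estimation is the inverse problem: given (possibly noisy, possibly incomplete) samples `x`, recover the frequencies and coefficients.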
9 Linear Inverse Problems. Find a solution of $y = \Phi x$, where $\Phi$ is an $m \times n$ matrix with $m < n$. Of the infinite collection of solutions, which one should we pick? Leverage structure: sparsity, rank, smoothness, symmetry. How do we design algorithms to solve underdetermined systems with priors?
10 Sparsity: the set of $s$-sparse vectors of unit Euclidean norm. Its convex hull is the unit ball of the $\ell_1$ norm.
11 minimize $\|x\|_1$ subject to $\Phi x = y$
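As a toy illustration of why minimizing the $\ell_1$ norm over the solution set promotes sparsity (an illustrative sketch, not from the slides; the system and grid are made up), one can brute-force the minimum-$\ell_1$ solution of a 1-by-2 underdetermined system and observe that it lands on a sparse point:

```python
# For the underdetermined system Phi x = y with Phi = [2, 1] and y = 2,
# the solution set is the line x = (t, 2 - 2t).  Minimizing the l1 norm
# over this line by a brute-force grid search lands on a sparse solution.
def l1(v):
    return sum(abs(c) for c in v)

candidates = [(t, 2 - 2 * t) for t in [i / 1000 - 1 for i in range(3001)]]
x_l1 = min(candidates, key=l1)
# The minimizer concentrates all its energy on one coordinate: x = (1, 0).
```

The geometric reason is the picture on the previous slide: the $\ell_1$ ball has corners on the coordinate axes, so the affine solution set typically first touches a scaled $\ell_1$ ball at a sparse point.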
12 Parsimonious models: model = $\sum$ (weights $\times$ atoms). Search for the best linear combination of the fewest atoms; "rank" = the fewest atoms needed to describe the model.
13 Atomic Norm Minimization (Chandrasekaran, Recht, Parrilo, and Willsky, 2010). IDEA: minimize $\|z\|_{\mathcal{A}}$ subject to $\Phi z = y$. Generalizes existing, powerful methods. A rigorous recipe for developing and analyzing new algorithms. Tightest bounds on the number of measurements needed for model recovery in all common models. One algorithm prototype for many data-mining applications.
14 Spectrum Estimation. Observe a sparse combination of sinusoids, $x^\star_m = \sum_{k=1}^{s} c^\star_k e^{i 2\pi m u^\star_k}$ for some $u^\star_k \in [0,1)$, through $y = x^\star + \omega$ (signal plus noise). Atomic set: $\mathcal{A} = \big\{ [\, e^{i\phi},\ e^{i 2\pi\theta + i\phi},\ e^{i 4\pi\theta + i\phi},\ \ldots,\ e^{i 2\pi n\theta + i\phi} \,]^T : \phi \in [0, 2\pi),\ \theta \in [0,1) \big\}$. Classical techniques (Prony, Matrix Pencil, MUSIC, ESPRIT, Cadzow) use the fact that noiseless moment matrices are low rank:
$\sum_{k=1}^{r} c_k \begin{bmatrix} 1 \\ e^{i\theta_k} \\ e^{2i\theta_k} \\ e^{3i\theta_k} \end{bmatrix} \begin{bmatrix} 1 \\ e^{i\theta_k} \\ e^{2i\theta_k} \\ e^{3i\theta_k} \end{bmatrix}^* = \begin{bmatrix} \mu_0 & \mu_1 & \mu_2 & \mu_3 \\ \bar\mu_1 & \mu_0 & \mu_1 & \mu_2 \\ \bar\mu_2 & \bar\mu_1 & \mu_0 & \mu_1 \\ \bar\mu_3 & \bar\mu_2 & \bar\mu_1 & \mu_0 \end{bmatrix}$
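The low-rank Toeplitz structure of the noiseless moment matrix can be verified numerically. Below is a small self-contained check (an illustrative sketch, not the talk's code; amplitudes and frequencies are made up) that the $4\times 4$ moment matrix built from $r = 2$ atoms is Toeplitz and has rank at most 2, using vanishing $3\times 3$ minors in place of a linear-algebra library:

```python
import cmath
from itertools import combinations

def moment_matrix(amps, thetas, n=4):
    """M = sum_k c_k a(theta_k) a(theta_k)^*, where a(theta) = (e^{i j theta})_{j=0..n-1}."""
    atoms = [[cmath.exp(1j * j * th) for j in range(n)] for th in thetas]
    return [[sum(c * a[i] * a[j].conjugate() for c, a in zip(amps, atoms))
             for j in range(n)] for i in range(n)]

def det3(m):
    # Cofactor expansion of a 3x3 complex determinant.
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

M = moment_matrix(amps=[1.0, 0.5], thetas=[0.7, 2.1])

# Toeplitz: entries depend only on the diagonal index i - j.
toeplitz = all(abs(M[i][j] - M[i + 1][j + 1]) < 1e-9
               for i in range(3) for j in range(3))

# Rank <= 2: every 3x3 minor of the 4x4 matrix vanishes.
rank_le_2 = all(abs(det3([[M[r][c] for c in cols] for r in rows])) < 1e-9
                for rows in combinations(range(4), 3)
                for cols in combinations(range(4), 3))
```

Classical methods exploit exactly this structure: with $r$ sinusoids the moment matrix has rank $r$, and the frequencies can be read off from its range space.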
15 Atomic Norm for Spectrum Estimation. IDEA: minimize $\|z\|_{\mathcal{A}}$ subject to $\|z - y\| \le \delta$. Atomic set: $\mathcal{A} = \big\{ [\, e^{i\phi},\ e^{i 2\pi\theta + i\phi},\ e^{i 4\pi\theta + i\phi},\ \ldots,\ e^{i 2\pi n\theta + i\phi} \,]^T : \phi \in [0, 2\pi),\ \theta \in [0,1) \big\}$. How do we solve the optimization problem? Can we approximate the true signal from partial and noisy measurements? Can we estimate the frequencies from partial and noisy measurements?
16 Which atomic norm for sinusoids? $x_m = \sum_{k=1}^{s} c_k e^{i 2\pi m u_k}$ for some $u_k \in [0,1)$. Assume the $c_k$ are positive for simplicity:
$\sum_{k=1}^{r} c_k \begin{bmatrix} 1 \\ e^{i 2\pi u_k} \\ e^{i 4\pi u_k} \\ e^{i 6\pi u_k} \end{bmatrix} \begin{bmatrix} 1 \\ e^{i 2\pi u_k} \\ e^{i 4\pi u_k} \\ e^{i 6\pi u_k} \end{bmatrix}^* = \begin{bmatrix} x_0 & x_1 & x_2 & x_3 \\ \bar x_1 & x_0 & x_1 & x_2 \\ \bar x_2 & \bar x_1 & x_0 & x_1 \\ \bar x_3 & \bar x_2 & \bar x_1 & x_0 \end{bmatrix}$
The convex hull is characterized by linear matrix inequalities (Toeplitz positive semidefinite):
$\|x\|_{\mathcal{A}} = \inf\left\{ \tfrac{1}{2} t + \tfrac{1}{2} w_0 : \begin{bmatrix} t & x^* \\ x & \mathrm{toep}(w) \end{bmatrix} \succeq 0 \right\}$
Compare the moment curve of $[t, t^2, t^3, t^4, \ldots]$, $t \in S^1$.
17 [Figure: the atomic set $\mathcal{A}$, the symmetrized set $\mathcal{A} \cup \{-\mathcal{A}\}$, and their convex hulls $\mathrm{conv}(\mathcal{A})$ and $\mathrm{conv}(\mathcal{A} \cup \{-\mathcal{A}\})$.]
18 [Figure: the two-tone signal $\cos(2\pi u_1 k)/2 + \cos(2\pi u_2 k)/2$ and its components $\cos(2\pi u_1 k)/2$ and $\cos(2\pi u_2 k)/2$, with nearby frequencies $u_1 = 0.43$ and $u_2 = 0.4$.]
19 Nearly optimal rates. Observe a sparse combination of sinusoids, $x^\star_m = \sum_{k=1}^{s} c^\star_k e^{i 2\pi m u^\star_k}$ for some $u^\star_k \in [0,1)$, through $y = x^\star + \omega$ (signal plus noise). Assume the frequencies are far apart: $\min_{p \neq q} d(u_p, u_q) \ge \frac{4}{n}$. Solve (AST): $\hat{x} = \arg\min_x \tfrac{1}{2}\|x - y\|_2^2 + \mu \|x\|_{\mathcal{A}}$. Error rate: $\frac{1}{n}\|\hat{x} - x^\star\|_2^2 \le C \sigma^2 \frac{s \log(n)}{n}$. No algorithm can do better than $\mathbb{E}\,\frac{1}{n}\|\hat{x} - x^\star\|_2^2 \ge C' \sigma^2 \frac{s \log(n/s)}{n}$ even if the frequencies are well separated, and no algorithm can do better than $\frac{1}{n}\|\hat{x} - x^\star\|_2^2 \ge C' \sigma^2 \frac{s}{n}$ even if we knew all of the frequencies $u^\star_k$.
20 Mean Square Error Performance Profile. Frequencies generated at random with $1/n$ separation; random phases, fading amplitudes. Cadzow and MUSIC are given the model order; AST and LASSO estimate the noise power. For MSE, lower is better; for the profile, higher is better. The performance profile aggregates over all parameter values and settings. For algorithm $s$: $P_s(\beta) = \dfrac{\#\{p \in \mathcal{P} : \mathrm{MSE}_s(p) \le \beta \min_{s'} \mathrm{MSE}_{s'}(p)\}}{\#(\mathcal{P})}$
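The profile $P_s(\beta)$ is easy to compute from a table of MSEs. The sketch below (with made-up MSE values, not the talk's experimental data) shows the computation:

```python
def performance_profiles(mse):
    """mse: dict mapping algorithm name -> list of MSEs over problem instances.
    Returns P(s, beta): the fraction of problems on which algorithm s
    is within a factor beta of the best algorithm on that problem."""
    n_prob = len(next(iter(mse.values())))
    best = [min(mse[s][p] for s in mse) for p in range(n_prob)]

    def P(s, beta):
        return sum(mse[s][p] <= beta * best[p] for p in range(n_prob)) / n_prob

    return P

# Toy data: two algorithms on three problem instances (hypothetical MSEs).
P = performance_profiles({"AST": [1.0, 2.0, 4.0], "MUSIC": [2.0, 1.0, 1.0]})
```

Here `P("AST", 1.0)` is the fraction of problems on which AST is the best method outright, and the profile as a function of $\beta$ shows how quickly each method catches up to the per-problem winner.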
21 Extracting the decomposition. How do you extract the frequencies? Look at the dual norm: $\|v\|_{\mathcal{A}}^* = \max_{a \in \mathcal{A}} \langle a, v \rangle = \max_{u \in [0,1)} \Big| \sum_{k=1}^{n} v_k e^{-2\pi i k u} \Big|$. The dual norm is the maximum modulus of a trigonometric polynomial. At optimality, the maximum modulus is attained at the support of the signal.
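The dual norm can be approximated by evaluating the dual polynomial on a fine grid and taking the largest modulus. The sketch below (illustrative, not the slides' code; the grid size and frequency are made up) shows that when $v$ is built from a single atom at frequency $u_0$, the maximum modulus is attained at $u_0$, which is exactly how peaks of the dual polynomial localize frequencies:

```python
import cmath

def dual_atomic_norm(v, grid=4096):
    """Approximates ||v||_A^* = max_{u in [0,1)} | sum_{k=1}^n v_k e^{-2 pi i k u} |
    by evaluating the dual polynomial on a uniform grid of candidate frequencies."""
    def modulus(u):
        return abs(sum(vk * cmath.exp(-2j * cmath.pi * k * u)
                       for k, vk in enumerate(v, start=1)))
    best_u = max((g / grid for g in range(grid)), key=modulus)
    return modulus(best_u), best_u

# For v matched to a single atom at frequency u0, every term aligns at u = u0,
# so the maximum modulus equals n and is attained exactly at u0.
u0 = 0.25
v = [cmath.exp(2j * cmath.pi * k * u0) for k in range(1, 9)]
val, u_star = dual_atomic_norm(v)
```

In the actual method, $v$ is the dual optimal solution of the atomic norm problem, and the frequencies are read off from the points where the dual polynomial attains its maximum modulus.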
22 Localization Guarantees. [Figure: magnitude of the dual polynomial; its peaks mark the estimated frequencies, within $0.6/n$ of the true frequencies, plus possible spurious frequencies.] Three metrics. Spurious amplitudes: $\sum_{l : \hat{f}_l \in F} |\hat{c}_l|$. Weighted frequency deviation: $\sum_{l : \hat{f}_l \in N_j} |\hat{c}_l|\, d(f_j, \hat{f}_l)^2$. Near-region approximation: $\big| c_j - \sum_{l : \hat{f}_l \in N_j} \hat{c}_l \big|$. Each of these quantities is bounded by a constant ($C_1$, $C_2$, $C_3$ respectively) times $k \sigma^2 \log(n)/n$.
23 Frequency Localization. [Figure: performance profiles $P_s(\beta)$ for the three localization metrics (spurious amplitudes, weighted frequency deviation, near-region approximation), comparing AST, MUSIC, and Cadzow across SNR (dB).]
24 Incomplete Data / Random Sampling. Observe a random subset of samples $T \subset \{0, 1, \ldots, n-1\}$. On the grid: Candès, Romberg & Tao (2004). Off the grid: new; compressed sensing extended to the continuous domain. Recover the missing part by solving: minimize $\|z\|_{\mathcal{A}}$ subject to $z_T = x_T$. [Figure: full observation vs. random sampling.] Extract the frequencies from the dual optimal solution.
25 Exact Recovery (off the grid). Given $x_m = \sum_{k=1}^{s} c_k \exp(2\pi i m u_k)$, we WANT $\{u_k, c_k\}$.
Theorem (Candès and Fernandez-Granda, 2012): A line spectrum with minimum frequency separation $\Delta > 4/s$ can be recovered from the first $2s$ Fourier coefficients via atomic norm minimization.
Theorem (Tang, Bhaskar, Shah, and Recht, 2012): A line spectrum with minimum frequency separation $\Delta > 4/n$ can be recovered from most subsets of the first $n$ Fourier coefficients of size at least $O(s \log(s) \log(n))$.
So $s$ random samples are better than $s$ equispaced samples. On a grid, this is just compressed sensing; off the grid, this is new, and there is no balancing of coherence as $n$ grows.
26 Performance Profile: Solution Accuracy. [Figure: performance profile $P(\beta)$ of solution accuracy in the noiseless setting, comparing the SDP with basis pursuit on discretized grids at several refinement levels (SDP, BP:4x, BP:64x).]
27 Back to the applications (imaging, astronomy, seismology, GPS, radar, ultrasound): $\hat{x}(\omega) = \widehat{\mathrm{PSF}}(\omega) \sum_{j=1}^{k} c_j e^{-i 2\pi \omega s_j}$ and $x(t) = \sum_{j=1}^{k} c_j\, g(t - t_j)\, e^{i 2\pi f_j t}$.
28 Summary and Future Work. A unified approach to continuous problems in estimation that obviates incoherence and basis mismatch. New insight into bounds on estimation error and exact recovery through convex duality and algebraic geometry. State-of-the-art results in the classic fields of spectrum estimation and system identification (ask me later!). Future work: recovery of general sparse signal trains, scaling atomic norm algorithms, and more atomization in signal processing (moments, PDEs, deep atoms).
29 Acknowledgements. Work developed with Venkat Chandrasekaran, Babak Hassibi (Caltech), Weiyu Xu (Iowa), Pablo A. Parrilo, Alan Willsky (MIT), Maryam Fazel (Washington), and Badri Bhaskar, Rob Nowak, Nikhil Rao, Gongguo Tang (Wisconsin). For all references, see:
30 References
- Atomic norm denoising with applications to line spectral estimation. Badri Narayan Bhaskar, Gongguo Tang, and Benjamin Recht. Submitted to IEEE Transactions on Signal Processing.
- Compressed sensing off the grid. Gongguo Tang, Badri Narayan Bhaskar, Parikshit Shah, and Benjamin Recht. Submitted to IEEE Transactions on Information Theory.
- Near minimax line spectral estimation. Gongguo Tang, Badri Narayan Bhaskar, and Benjamin Recht. Submitted to IEEE Transactions on Information Theory, 2013.
- The convex geometry of inverse problems. Venkat Chandrasekaran, Benjamin Recht, Pablo A. Parrilo, and Alan S. Willsky. To appear in Foundations of Computational Mathematics.
All references can be found at
Convex optimization Javier Peña Carnegie Mellon University Universidad de los Andes Bogotá, Colombia September 2014 1 / 41 Convex optimization Problem of the form where Q R n convex set: min x f(x) x Q,
More informationGaussian Phase Transitions and Conic Intrinsic Volumes: Steining the Steiner formula
Gaussian Phase Transitions and Conic Intrinsic Volumes: Steining the Steiner formula Larry Goldstein, University of Southern California Nourdin GIoVAnNi Peccati Luxembourg University University British
More informationGREEDY SIGNAL RECOVERY REVIEW
GREEDY SIGNAL RECOVERY REVIEW DEANNA NEEDELL, JOEL A. TROPP, ROMAN VERSHYNIN Abstract. The two major approaches to sparse recovery are L 1-minimization and greedy methods. Recently, Needell and Vershynin
More informationNear Ideal Behavior of a Modified Elastic Net Algorithm in Compressed Sensing
Near Ideal Behavior of a Modified Elastic Net Algorithm in Compressed Sensing M. Vidyasagar Cecil & Ida Green Chair The University of Texas at Dallas M.Vidyasagar@utdallas.edu www.utdallas.edu/ m.vidyasagar
More informationRecovery of Simultaneously Structured Models using Convex Optimization
Recovery of Simultaneously Structured Models using Convex Optimization Maryam Fazel University of Washington Joint work with: Amin Jalali (UW), Samet Oymak and Babak Hassibi (Caltech) Yonina Eldar (Technion)
More informationSeismic data interpolation and denoising using SVD-free low-rank matrix factorization
Seismic data interpolation and denoising using SVD-free low-rank matrix factorization R. Kumar, A.Y. Aravkin,, H. Mansour,, B. Recht and F.J. Herrmann Dept. of Earth and Ocean sciences, University of British
More informationSelf-Calibration and Biconvex Compressive Sensing
Self-Calibration and Biconvex Compressive Sensing Shuyang Ling Department of Mathematics, UC Davis July 12, 2017 Shuyang Ling (UC Davis) SIAM Annual Meeting, 2017, Pittsburgh July 12, 2017 1 / 22 Acknowledgements
More informationOn the coherence barrier and analogue problems in compressed sensing
On the coherence barrier and analogue problems in compressed sensing Clarice Poon University of Cambridge June 1, 2017 Joint work with: Ben Adcock Anders Hansen Bogdan Roman (Simon Fraser) (Cambridge)
More informationSpectral Compressed Sensing via Structured Matrix Completion
Yuxin Chen yxchen@stanford.edu Department of Electrical Engineering, Stanford University, Stanford, CA 94305, USA Yuejie Chi chi@ece.osu.edu Electrical and Computer Engineering, The Ohio State University,
More informationRecovering any low-rank matrix, provably
Recovering any low-rank matrix, provably Rachel Ward University of Texas at Austin October, 2014 Joint work with Yudong Chen (U.C. Berkeley), Srinadh Bhojanapalli and Sujay Sanghavi (U.T. Austin) Matrix
More information