Basis Pursuit Denoising and the Dantzig Selector
Slide 1, p. 1/16: Basis Pursuit Denoising and the Dantzig Selector
West Coast Optimization Meeting, University of Washington, Seattle, WA, April 28-29, 2007
Michael Friedlander, Dept of Computer Science, University of British Columbia, Vancouver, BC V6K 2C6
Michael Saunders, Dept of Management Sci & Eng, Stanford University, Stanford, CA
Slides 2-4, p. 2/16: Abstract
Many imaging and compressed sensing applications seek sparse solutions to underdetermined least-squares problems. The Lasso and Basis Pursuit Denoising (BPDN) approaches of bounding the 1-norm of the solution have led to several computational algorithms. PDCO uses an interior method to handle general linear constraints and bounds. Homotopy, LARS, OMP, and STOMP are specialized active-set methods for handling the implicit bounds. l1_ls and GPSR are further recent entries in the l1-regularized least-squares competition, both based on bound-constrained optimization. The Dantzig Selector of Candès and Tao is promising in its production of sparse solutions using only linear programming. Again, interior or active-set (simplex) methods may be used. We compare the BPDN and DS approaches via their dual problems and some numerical examples.
Slides 5-7, p. 3/16: Sparse x

Lasso(t), Tibshirani 1996 (A = explicit):
    min_x (1/2)||b - Ax||_2^2  s.t.  ||x||_1 <= t

Basis Pursuit, Chen, Donoho & S 2001 (A = fast operator):
    min_x ||x||_1  s.t.  Ax = b

BPDN(λ), Chen, Donoho & S 2001 (A = fast operator):
    min_x (1/2)||b - Ax||_2^2 + λ||x||_1
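As a concrete illustration of the Basis Pursuit formulation above, the problem min ||x||_1 s.t. Ax = b can be posed as a linear program by splitting x = v - w with v, w >= 0. The following is a minimal sketch using numpy and scipy's general-purpose LP solver (not one of the specialized solvers discussed in this talk); the dimensions and sparsity level are illustrative choices.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n = 10, 30
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, 3, replace=False)] = [1.0, -1.0, 1.0]  # 3-sparse signal
b = A @ x_true

# BP: min ||x||_1 s.t. Ax = b.  Split x = v - w with v, w >= 0:
#     min 1'(v + w)  s.t.  [A  -A][v; w] = b.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))
x = res.x[:n] - res.x[n:]
```

Since x_true is feasible, the LP optimum is at most ||x_true||_1; with enough measurements the minimizer often coincides with the sparse signal itself.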
Slide 8, p. 4/16: BP and BPDN Algorithms

- OMP, Davis, Mallat et al 1997: Greedy
- BPDN-interior, Chen, Donoho & S 1998: Interior, CG
- PDSCO, PDCO, S 1997, 2002: Interior, LSQR
- BCR, Sardy, Bruce & Tseng 2000: Orthogonal blocks
- Homotopy, Osborne et al 2000: Active-set, all λ
- LARS, Efron, Hastie et al 2004: Active-set, all λ
- STOMP, Donoho, Tsaig et al 2006: Double greedy
- l1_ls, Kim, Koh et al 2007: Primal barrier, PCG
- GPSR, Figueiredo, Nowak & Wright 2007: Gradient projection
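The first entry in the table, OMP, is the simplest of the greedy methods: repeatedly pick the column most correlated with the current residual, then re-fit by least squares on the chosen support. A minimal numpy sketch (the problem sizes are illustrative, not the talk's test data):

```python
import numpy as np

def omp(A, b, k):
    """Minimal Orthogonal Matching Pursuit: greedily add the column most
    correlated with the residual, then re-fit by least squares."""
    m, n = A.shape
    support, r = [], b.copy()
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ r)))       # most correlated column
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        r = b - A[:, support] @ coef              # updated residual
    x = np.zeros(n)
    x[support] = coef
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((60, 100))
A /= np.linalg.norm(A, axis=0)                    # unit-norm columns
x0 = np.zeros(100); x0[[3, 17, 62]] = [1.0, -2.0, 1.5]
b = A @ x0
x_hat = omp(A, b, 3)
```

With 60 measurements and a 3-sparse signal, OMP typically recovers the support exactly, so the final least-squares fit reproduces b.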
Slides 9-12, p. 5/16: Basis Pursuit Denoising (BPDN), Chen, Donoho and S 1998

Pure LS:
    min_{x,r} (1/2)||r||_2^2  s.t.  r = b - Ax

If x is not unique, we need regularization.

BPDN(λ):
    min_{x,r} λ||x||_1 + (1/2)||r||_2^2  s.t.  r = b - Ax

Trade-off: smaller ||x||_1 implies bigger ||r||_2.
Slides 13-16, p. 6/16: The Dantzig Selector (DS), Candès and Tao 2007

Pure LS:
    A^T r = 0,  r = b - Ax

Plausible regularization:
    min_{x,r} ||x||_1  s.t.  A^T r = 0,  r = b - Ax

DS(λ):
    min_{x,r} ||x||_1  s.t.  ||A^T r||_∞ <= λ,  r = b - Ax

Trade-off: smaller ||x||_1 implies bigger ||A^T r||_∞.
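DS(λ) as stated above is a linear program in (x, u) once ||x||_1 is modeled with a bound variable u. A minimal sketch with scipy's LP solver (illustrative sizes; the dense A^T A rows make this practical only for small problems, a point the implementation slides return to):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(5)
m, n, lam = 15, 40, 1e-3
A = rng.standard_normal((m, n))
x0 = np.zeros(n); x0[[1, 20]] = [1.0, -1.0]
b = A @ x0

# DS(lam): min 1'u  s.t.  -u <= x <= u,  -lam <= A'(b - Ax) <= lam,
# written as inequalities in the stacked variable [x; u].
In = np.eye(n)
G = A.T @ A
A_ub = np.vstack([
    np.hstack([ In, -In]),                #  x - u <= 0
    np.hstack([-In, -In]),                # -x - u <= 0
    np.hstack([  G, np.zeros((n, n))]),   #  A'A x <= A'b + lam
    np.hstack([ -G, np.zeros((n, n))]),   # -A'A x <= lam - A'b
])
b_ub = np.concatenate([np.zeros(2 * n), A.T @ b + lam, lam - A.T @ b])
c = np.concatenate([np.zeros(n), np.ones(n)])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(None, None))
x = res.x[:n]
```

Since x0 satisfies A^T(b - Ax0) = 0, it is DS-feasible, so the minimizer's 1-norm is at most ||x0||_1.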
Slides 17-20, p. 7/16: BP Denoising and the Dantzig Selector: Dual problems

BPDN(λ) (QP):
    min_{x,r} λ||x||_1 + (1/2)||r||_2^2  s.t.  r = b - Ax

DS(λ) (LP):
    min_{x,r} ||x||_1  s.t.  ||A^T r||_∞ <= λ,  r = b - Ax

BPdual(λ):
    max_r b^T r - (1/2)||r||_2^2  s.t.  ||A^T r||_∞ <= λ

DSdual(λ):
    max_{r,z} b^T r - λ||z||_1  s.t.  ||A^T r||_∞ <= λ,  r = Az
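The BPdual(λ) problem above gives a cheap numerical sanity check via weak duality: any r with ||A^T r||_∞ <= λ yields a lower bound b^T r - (1/2)||r||_2^2 on the BPDN(λ) objective at every x. A small numpy sketch (the random data and the particular feasible r are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
m, n, lam = 20, 50, 0.5
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

def primal(x):
    """BPDN(lam) objective."""
    return lam * np.abs(x).sum() + 0.5 * np.linalg.norm(b - A @ x) ** 2

def dual(r):
    """BPdual(lam) objective (feasible iff ||A'r||_inf <= lam)."""
    return b @ r - 0.5 * np.linalg.norm(r) ** 2

# Make an arbitrary direction dual-feasible by scaling down until
# ||A'r||_inf <= lam.
r = b.copy()
r *= lam / max(np.abs(A.T @ r).max(), lam)

x = rng.standard_normal(n)   # any primal point
gap_ok = dual(r) <= primal(x)
```

The inequality holds because primal(x) - dual(r) = (λ||x||_1 - x^T A^T r) + (1/2)||b - Ax - r||_2^2, and both terms are nonnegative when ||A^T r||_∞ <= λ.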
Slides 21-22, p. 8/16: BPDN(λ) implementation, Chen, Donoho & S 1998

    min_{v,w,r} λ1^T(v + w) + (1/2)r^T r
    s.t.  [A  -A][v; w] + r = b,  v, w >= 0

2007: Apply PDCO (Matlab primal-dual interior method).

Dense A in test problems leads to dense Cholesky and double-handling of A:
    [A  -A] diag(D_1, D_2) [A  -A]^T  could be coded as  A(D_1 + D_2)A^T.
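The splitting x = v - w with v, w >= 0 used above turns BPDN(λ) into a smooth bound-constrained problem, the same structure that l1_ls and GPSR exploit. As an illustration (not PDCO itself), a hypothetical solver can hand this formulation to a generic bound-constrained method such as scipy's L-BFGS-B:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
m, n, lam = 25, 60, 0.2
A = rng.standard_normal((m, n))
x0 = np.zeros(n); x0[[2, 30]] = [1.5, -1.0]
b = A @ x0 + 0.01 * rng.standard_normal(m)

def f_and_grad(vw):
    """Objective lam*1'(v+w) + 0.5*||b - A(v-w)||^2 and its gradient."""
    v, w = vw[:n], vw[n:]
    r = b - A @ (v - w)
    f = lam * (v.sum() + w.sum()) + 0.5 * r @ r
    g = -A.T @ r                      # gradient of the LS term w.r.t. x
    return f, np.concatenate([lam + g, lam - g])

res = minimize(f_and_grad, np.zeros(2 * n), jac=True, method="L-BFGS-B",
               bounds=[(0, None)] * (2 * n))
x = res.x[:n] - res.x[n:]
```

At the optimum the KKT conditions of this bound-constrained problem reproduce the familiar BPDN condition ||A^T(b - Ax)||_∞ <= λ.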
Slides 23-24, p. 9/16: DS(λ) implementation, Candès and Tao 2007

    min_{x,u} 1^T u
    s.t.  -u <= x <= u,  -λ1 <= A^T(b - Ax) <= λ1

Apply l1dantzig_pd (Matlab primal-dual interior method), Romberg 2005; part of l1-magic, Candès 2006.

Dense A in test problems: dense Cholesky on A^T(A D_1 A^T)A + D_2 (much bigger).
Slides 25-26, p. 10/16: Two other DS LP implementations

Introduce s = A^T r.

DS1 (Interior):
    min_{v,w,s} 1^T(v + w)
    s.t.  [A^T A  -A^T A  I][v; w; s] = A^T b,  v, w >= 0,  ||s||_∞ <= λ

DS2 (Interior, Simplex):
    min_{v,w,r,s} 1^T(v + w)
    s.t.  [A    -A    I    0 ] [v; w; r; s] = [b; 0],  v, w >= 0,  ||s||_∞ <= λ
          [0     0   A^T  -I ]
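The DS2 formulation above can be written down directly for a generic LP solver. A minimal sketch with scipy's linprog (illustrative sizes; the point of DS2 in the talk is its structure for interior and simplex codes, not this dense construction):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(6)
m, n, lam = 15, 40, 1e-3
A = rng.standard_normal((m, n))
x0 = np.zeros(n); x0[[4, 25]] = [1.0, -1.0]
b = A @ x0

# DS2: min 1'(v+w)  s.t.  A(v-w) + r = b,  A'r - s = 0,
#      v, w >= 0,  -lam <= s <= lam.  Stacked variable [v; w; r; s].
Zmn, Znn = np.zeros((m, n)), np.zeros((n, n))
A_eq = np.vstack([
    np.hstack([A,   -A,   np.eye(m), Zmn]),        # A(v-w) + r = b
    np.hstack([Znn, Znn,  A.T,  -np.eye(n)]),      # A'r - s = 0
])
b_eq = np.concatenate([b, np.zeros(n)])
c = np.concatenate([np.ones(2 * n), np.zeros(m + n)])
bounds = [(0, None)] * (2 * n) + [(None, None)] * m + [(-lam, lam)] * n
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
x = res.x[:n] - res.x[n:2 * n]
r = res.x[2 * n:2 * n + m]
```

The bound constraints on s enforce ||A^T r||_∞ <= λ without any inequality rows, which is what makes this form convenient for simplex and interior codes.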
Slide 27, p. 11/16: Test data. A, b depend on dimensions m, n, T.

    rand ('state',0);             % initialize generators
    randn('state',0);
    x = zeros(n,1);               % random +/-1 signal
    q = randperm(n);
    x(q(1:T)) = sign(randn(T,1));
    [A,R] = qr(randn(n,m),0);
    A = A';                       % m x n measurement mtx
    sigma = 0.005;
    b = A*x + sigma*randn(m,1);   % noisy observations

A dense, A A^T = I, T components x_j = ±1.
For example: m = 500, n = 2000, T = 80, λ = 3e-3.
Slide 28, p. 12/16: DS vs BPDN. DS with interior methods vs BPDN interior and greedy.

[Figure: runtimes (up to 800 secs) vs m (with n > 4m) for DS2 pdco, DS2 cplex barrier, DS1 pdco, DS l1magic, BPDN pdco, BPDN greedy.]
Slide 29, p. 13/16: CPLEX dual simplex on DS2 with loose and tight tols

[Table: problem sizes (m, n, T) with iterations, time, and sparsity S for tol = 0.1 and a tighter tol; numeric entries not recovered.]

Too many simplex iterations, too many x_j ≈ 0.
Slide 30, p. 14/16: CPLEX dual simplex on DS2. Large and small x_j.

[Figure: solution values x_j for dual simplex with tol = 0.1 and a tighter tol.]

More small values means more simplex iterations and more time per iteration.
Slide 31, p. 15/16: References

- E. Candès, l1-magic.
- E. Candès and T. Tao, The Dantzig selector: Statistical estimation when p is much larger than n, Annals of Statistics, to appear (2007).
- S. S. Chen, D. L. Donoho, and M. A. Saunders, Atomic decomposition by basis pursuit, SIAM Review, 43 (2001).
- B. Efron, T. Hastie, I. Johnstone, and R. Tibshirani, Least angle regression, Ann. Statist., 32 (2004).
- M. R. Osborne, B. Presnell, and B. A. Turlach, The Lasso and its dual, J. of Computational and Graphical Statistics, 9 (2000).
- M. R. Osborne, B. Presnell, and B. A. Turlach, A new approach to variable selection in least squares problems, IMA J. of Numerical Analysis, 20 (2000).
- M. A. Saunders, PDCO: Matlab software for convex optimization.
- R. Tibshirani, Regression shrinkage and selection via the lasso, J. Roy. Statist. Soc. Ser. B, 58 (1996).
- Y. Tsaig, Sparse Solution of Underdetermined Linear Systems: Algorithms and Applications, PhD thesis, Stanford University, 2007.
Slides 32-33, p. 16/16: Many thanks, Michael, and Jim, Terry, Paul.
More informationAlgorithms for sparse optimization
Algorithms for sparse optimization Michael P. Friedlander University of California, Davis Google May 19, 2015 Sparse representations = + y = Hx y = Dx y = [ H D ] x 20 40 60 80 100 20 40 60 80 100 50 100
More informationIEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 58, NO. 6, JUNE
IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 58, NO. 6, JUNE 2010 2935 Variance-Component Based Sparse Signal Reconstruction and Model Selection Kun Qiu, Student Member, IEEE, and Aleksandar Dogandzic,
More informationTopographic Dictionary Learning with Structured Sparsity
Topographic Dictionary Learning with Structured Sparsity Julien Mairal 1 Rodolphe Jenatton 2 Guillaume Obozinski 2 Francis Bach 2 1 UC Berkeley 2 INRIA - SIERRA Project-Team San Diego, Wavelets and Sparsity
More informationCompressive Sensing Theory and L1-Related Optimization Algorithms
Compressive Sensing Theory and L1-Related Optimization Algorithms Yin Zhang Department of Computational and Applied Mathematics Rice University, Houston, Texas, USA CAAM Colloquium January 26, 2009 Outline:
More informationA new method on deterministic construction of the measurement matrix in compressed sensing
A new method on deterministic construction of the measurement matrix in compressed sensing Qun Mo 1 arxiv:1503.01250v1 [cs.it] 4 Mar 2015 Abstract Construction on the measurement matrix A is a central
More informationA Fixed-Point Continuation Method for l 1 -Regularized Minimization with Applications to Compressed Sensing
CAAM Technical Report TR07-07 A Fixed-Point Continuation Method for l 1 -Regularized Minimization with Applications to Compressed Sensing Elaine T. Hale, Wotao Yin, and Yin Zhang Department of Computational
More information10-725/ Optimization Midterm Exam
10-725/36-725 Optimization Midterm Exam November 6, 2012 NAME: ANDREW ID: Instructions: This exam is 1hr 20mins long Except for a single two-sided sheet of notes, no other material or discussion is permitted
More informationAcommon problem in signal processing is to estimate an
5758 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 55, NO. 12, DECEMBER 2009 Necessary and Sufficient Conditions for Sparsity Pattern Recovery Alyson K. Fletcher, Member, IEEE, Sundeep Rangan, and Vivek
More informationA complex version of the LASSO algorithm and its application to beamforming
The 7 th International Telecommunications Symposium ITS 21) A complex version of the LASSO algorithm and its application to beamforming José F. de Andrade Jr. and Marcello L. R. de Campos Program of Electrical
More informationA direct formulation for sparse PCA using semidefinite programming
A direct formulation for sparse PCA using semidefinite programming A. d Aspremont, L. El Ghaoui, M. Jordan, G. Lanckriet ORFE, Princeton University & EECS, U.C. Berkeley Available online at www.princeton.edu/~aspremon
More information