Model Selection with Partly Smooth Functions
Samuel Vaiter, Gabriel Peyré and Jalal Fadili
August 27, 2014, iTWIST'14
Model Consistency of Partly Smooth Regularizers, arXiv:
Linear Inverse Problems

Forward model: $y = \Phi x_0 + w$.
Forward operator: $\Phi : \mathbb{R}^n \to \mathbb{R}^q$ linear, with $q \leq n$: an ill-posed problem.
Examples: denoising, inpainting, deblurring.
Variational Regularization

Trade-off between prior regularization and data fidelity:
$$x^\star \in \operatorname*{Argmin}_{x \in \mathbb{R}^n} \; J(x) + \frac{1}{2\lambda}\|y - \Phi x\|^2 \qquad (P_{y,\lambda})$$
As $\lambda \to 0^+$, this becomes the constrained problem
$$x^\star \in \operatorname*{Argmin}_{x \in \mathbb{R}^n} \; J(x) \quad \text{subject to} \quad y = \Phi x \qquad (P_{y,0})$$
Here $J$ is a convex, bounded from below and finite-valued function, typically non-smooth.
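To make $(P_{y,\lambda})$ concrete, here is a minimal numerical sketch for the case $J = \|\cdot\|_1$ (the sparsity prior of the next slide), solved by plain proximal gradient descent (ISTA). The random operator $\Phi$, the noise level and the step size are illustrative choices, not part of the talk.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def solve_lasso(Phi, y, lam, n_iter=2000):
    # ISTA for min_x ||x||_1 + 1/(2*lam) ||y - Phi x||^2,
    # i.e. (up to rescaling) min_x lam*||x||_1 + 0.5*||y - Phi x||^2.
    L = np.linalg.norm(Phi, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ x - y)          # gradient of the fidelity term
        x = soft_threshold(x - grad / L, lam / L)
    return x

rng = np.random.default_rng(0)
q, n = 32, 100
Phi = rng.standard_normal((q, n)) / np.sqrt(q)
x0 = np.zeros(n); x0[[5, 40, 77]] = [2.0, -1.5, 3.0]   # sparse ground truth
y = Phi @ x0 + 0.01 * rng.standard_normal(q)           # noisy measurements
x_star = solve_lasso(Phi, y, lam=0.02)
print("recovered support:", np.flatnonzero(np.abs(x_star) > 1e-8))
```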
Objective

[Diagram: the unknown signal $x_0$, the observation $y$, and the recovered solution $x^\star$.]
Low Complexity Models

Sparsity: $J(x) = \sum_{i=1,\dots,n} |x_i|$, with model manifold $\mathcal{M}_x = \{\, x' : \operatorname{supp}(x') \subseteq \operatorname{supp}(x) \,\}$.

Group sparsity: $J(x) = \sum_{b \in \mathcal{B}} \|x_b\|$, with the analogous block-support model.

Low rank: $J(x) = \sum_{i=1,\dots,n} \sigma_i(x)$ (the nuclear norm), with $\mathcal{M}_x = \{\, x' : \operatorname{rank}(x') = \operatorname{rank}(x) \,\}$.
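A quick numerical illustration of the three regularizers (a sketch; the test vector, block structure and matrix are arbitrary):

```python
import numpy as np

x = np.array([0.0, 3.0, 0.0, -1.0, 0.0, 2.0])

# Sparsity: J(x) = sum_i |x_i|; the model of x is its support.
J_sparse = np.sum(np.abs(x))
support = np.flatnonzero(x)

# Group sparsity: J(x) = sum over blocks b of ||x_b||_2 (here 3 blocks of size 2).
J_group = np.sum(np.linalg.norm(x.reshape(3, 2), axis=1))

# Low rank: J(X) = sum of singular values (nuclear norm); the model is the rank.
X = np.outer([1.0, 2.0], [3.0, -1.0, 0.5])     # a rank-1 matrix
J_nuclear = np.sum(np.linalg.svd(X, compute_uv=False))

print(J_sparse, support.tolist(), J_group, J_nuclear, np.linalg.matrix_rank(X))
```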
Partly Smooth Functions [Lewis 2002]

[Figure: the manifold $\mathcal{M}$ through $x$ and its tangent space $T_x\mathcal{M}$.]

$J$ is partly smooth at $x$ relative to a $C^2$-manifold $\mathcal{M}$ if:
Smoothness. $J$ restricted to $\mathcal{M}$ is $C^2$ around $x$.
Sharpness. For every $h \in (T_x\mathcal{M})^\perp$, $t \mapsto J(x + th)$ is non-smooth at $t = 0$.
Continuity. The subdifferential $\partial J$ is continuous on $\mathcal{M}$ around $x$.

Calculus rules: if $J$ and $G$ are partly smooth, so are $J + G$, $J \circ D$ with $D$ a linear operator, and the spectral lift $J_\sigma$. In particular $\ell^1$, $\ell^\infty$, $\ell^{1,2}$ and $x \mapsto \max_i (\langle d_i, x\rangle)_+$ are partly smooth.
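As a worked check of the definition (filled in here for concreteness, not from the slides), take $J = \|\cdot\|_1$ relative to the support manifold of the previous slide:

```latex
\text{Let } J = \|\cdot\|_1,\quad I = \operatorname{supp}(x),\quad
\mathcal{M} = \{\, x' : \operatorname{supp}(x') \subseteq I \,\}.
\\[4pt]
\textbf{Smoothness: } \text{near } x \text{ on } \mathcal{M},\
J(x') = \textstyle\sum_{i \in I} \operatorname{sign}(x_i)\, x'_i,
\ \text{a linear (hence } C^2 \text{) function.}
\\[4pt]
\textbf{Sharpness: } T_x\mathcal{M} = \operatorname{span}\{ e_i : i \in I \},
\ \text{so } h \in (T_x\mathcal{M})^\perp \Rightarrow
J(x + t h) = J(x) + |t|\,\|h\|_1, \ \text{non-smooth at } t = 0.
\\[4pt]
\textbf{Continuity: } \partial J(x') =
\{\, u : u_I = \operatorname{sign}(x'_I),\ \|u_{I^c}\|_\infty \le 1 \,\}
\ \text{varies continuously along } \mathcal{M} \text{ near } x.
```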
Dual Certificates

$$x^\star \in \operatorname*{Argmin}_{x \in \mathbb{R}^n} \; J(x) \quad \text{subject to} \quad y = \Phi x \qquad (P_{y,0})$$

[Figure: the feasible affine space $\{x : \Phi x = \Phi x_0\}$ and the subdifferential $\partial J(x)$.]

Source condition: there exists $p$ such that $\Phi^* p \in \partial J(x_0)$.
Non-degenerate source condition: $\Phi^* p \in \operatorname{ri} \partial J(x_0)$.

Proposition. There exists a dual certificate $p$ if, and only if, $x_0$ is a solution of $(P_{y,0})$.
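For $J = \|\cdot\|_1$ these conditions are fully explicit (a standard specialization, added here for concreteness):

```latex
\partial\|\cdot\|_1(x_0)
  = \{\, u : u_I = \operatorname{sign}(x_{0,I}),\ \|u_{I^c}\|_\infty \le 1 \,\},
\qquad I = \operatorname{supp}(x_0),
\\[4pt]
\text{so the source condition reads }
(\Phi^* p)_I = \operatorname{sign}(x_{0,I}),\quad
\|(\Phi^* p)_{I^c}\|_\infty \le 1,
\\[4pt]
\text{and the non-degenerate version requires }
\|(\Phi^* p)_{I^c}\|_\infty < 1.
```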
Linearized Precertificate

Minimal norm certificate: $p_0 = \operatorname{argmin} \|p\|$ subject to $\Phi^* p \in \partial J(x_0)$.
Linearized precertificate: $p_F = \operatorname{argmin} \|p\|$ subject to $\Phi^* p \in \operatorname{aff} \partial J(x_0)$.

Proposition. Assume $\operatorname{Ker} \Phi \cap T_{x_0} = \{0\}$, where $T_{x_0}$ is the tangent space of $\mathcal{M}$ at $x_0$. Then
$$\Phi^* p_F \in \operatorname{ri} \partial J(x_0) \;\Longrightarrow\; p_F = p_0.$$
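For $\ell^1$, $\operatorname{aff}\partial J(x_0) = \{u : u_I = \operatorname{sign}(x_{0,I})\}$, so $p_F$ is the least-norm solution of a linear system and can be computed in closed form. A minimal sketch (the random $\Phi$ and the support are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
q, n = 32, 100
Phi = rng.standard_normal((q, n)) / np.sqrt(q)
x0 = np.zeros(n); x0[[5, 40, 77]] = [2.0, -1.5, 3.0]

I = np.flatnonzero(x0)                 # support of x0
s = np.sign(x0[I])                     # sign pattern on the support

# p_F = argmin ||p|| s.t. Phi_I^* p = sign(x0_I):
# the least-norm solution of an underdetermined linear system.
p_F = np.linalg.pinv(Phi[:, I].T) @ s

eta_F = Phi.T @ p_F                    # eta_F = Phi^* p_F
Ic = np.setdiff1d(np.arange(n), I)
print("max |eta_F| off the support:", np.max(np.abs(eta_F[Ic])))
print("non-degenerate (eta_F in ri dJ(x0)):", np.max(np.abs(eta_F[Ic])) < 1)
```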
Manifold Selection

Theorem. Assume $J$ is partly smooth at $x_0$ relative to $\mathcal{M}$, and that
$$\Phi^* p_F \in \operatorname{ri} \partial J(x_0) \quad \text{and} \quad \operatorname{Ker} \Phi \cap T_{x_0} = \{0\}.$$
Then there exists $C > 0$ such that if $\max(\lambda, \|w\|/\lambda) \leq C$, the unique solution $x^\star$ of $(P_{y,\lambda})$ satisfies
$$x^\star \in \mathcal{M} \quad \text{and} \quad \|x^\star - x_0\| = O(\|w\|).$$

The analysis is almost sharp: if $\Phi^* p_F \notin \partial J(x_0)$, then $x^\star \notin \mathcal{M}_{x_0}$.

Special cases: [Fuchs 2004] for $\ell^1$; [Bach 2008] for $\ell^1 - \ell^2$ and the nuclear norm.
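A small numerical check of the theorem in the $\ell^1$ case (a sketch: ISTA as in the earlier block, with $\lambda$ scaled with the noise so that both $\lambda$ and $\|w\|/\lambda$ stay small; the identified support is read off the exact zeros produced by soft-thresholding):

```python
import numpy as np

def ista(Phi, y, lam, n_iter=5000):
    # Proximal gradient for min_x lam*||x||_1 + 0.5*||y - Phi x||^2
    L = np.linalg.norm(Phi, 2) ** 2
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        z = x - Phi.T @ (Phi @ x - y) / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x

rng = np.random.default_rng(2)
q, n = 64, 128
Phi = rng.standard_normal((q, n)) / np.sqrt(q)
x0 = np.zeros(n); x0[[10, 50, 90]] = [2.0, -1.5, 3.0]

for sigma in [1e-4, 1e-3, 1e-2]:
    w = sigma * rng.standard_normal(q)
    lam = 10 * sigma                   # lambda ~ noise level: ||w||/lambda stays bounded
    x_star = ista(Phi, Phi @ x0 + w, lam)
    on_manifold = np.array_equal(np.flatnonzero(np.abs(x_star) > 1e-8),
                                 np.flatnonzero(x0))
    print(f"sigma={sigma:.0e}: x* in M_x0 = {on_manifold}, "
          f"||x* - x0|| = {np.linalg.norm(x_star - x0):.2e}")
```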
Sparse Spike Deconvolution

$$\Phi x = \sum_i x_i\, \varphi(\cdot - i), \qquad J(x) = \|x\|_1.$$

[Figure: the spikes $x_0$ and the blurred observation $\Phi x_0$ for a convolution kernel of width $\gamma$.]

With $I = \operatorname{supp}(x_0)$, non-degeneracy of the precertificate is explicit:
$$\eta_F = \Phi^* p_F \in \operatorname{ri} \partial J(x_0) \;\Longleftrightarrow\; \|\Phi_{I^c}^* \Phi_I^{+,*} \operatorname{sign}(x_{0,I})\|_\infty < 1,$$
which guarantees stable recovery.

[Plot: the off-support certificate magnitude $\|\eta_{0,I^c}\|_\infty$ as a function of the kernel width $\gamma$; it crosses the level 1 at a critical width $\gamma_{\mathrm{crit}}$.]
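The quantity above can be evaluated numerically as a function of the kernel width. A sketch assuming a sampled Gaussian kernel; the grid size, spike positions and signs are arbitrary choices:

```python
import numpy as np

def fuchs_criterion(gamma, n=100, spikes=(30, 50, 72), signs=(1.0, -1.0, 1.0)):
    # Column i of Phi is the sampled kernel phi(. - i), here a Gaussian of width gamma.
    t = np.arange(n)
    Phi = np.exp(-(t[:, None] - t[None, :]) ** 2 / (2.0 * gamma ** 2))
    I = np.array(spikes)
    s = np.array(signs)
    eta_F = Phi.T @ np.linalg.pinv(Phi[:, I].T) @ s   # eta_F = Phi^* Phi_I^{+,*} sign(x0_I)
    Ic = np.setdiff1d(t, I)
    return np.max(np.abs(eta_F[Ic]))                  # < 1 <=> non-degenerate certificate

for gamma in [1.0, 2.0, 4.0, 8.0]:
    val = fuchs_criterion(gamma)
    print(f"gamma = {gamma:4.1f}: off-support max = {val:.3f} "
          f"({'stable' if val < 1 else 'unstable'})")
```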
1D Total Variation and Jump Set

$$J = \|\nabla \cdot\|_1, \qquad \mathcal{M}_x = \{\, x' : \operatorname{supp}(\nabla x') \subseteq \operatorname{supp}(\nabla x) \,\}, \qquad \Phi = \operatorname{Id}.$$

The certificate takes the form $\Phi^* p_F = \operatorname{div} u$ for a field $u$.
[Figure: a piecewise-constant signal $x$ with jumps at positions $i_k$; $u$ attains $\pm 1$ at the jumps, and a jump is stable or unstable depending on whether $|u|$ stays below 1 in between.]
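A sketch of this computation (my construction under the conventions above: forward differences for $\nabla$, $\Phi = \operatorname{Id}$, and $p_F$ obtained by minimizing $\|\nabla^* u\|$ over the free entries of $u$):

```python
import numpy as np

n = 40
x0 = np.concatenate([np.zeros(15), 2.0 * np.ones(12), 0.5 * np.ones(13)])  # two jumps

# Discrete gradient: (grad x)_i = x_{i+1} - x_i, so J(x) = ||grad x||_1 is the 1D TV.
grad = np.diff(np.eye(n), axis=0)            # (n-1) x n finite-difference matrix

g = grad @ x0
S = np.flatnonzero(np.abs(g) > 1e-12)        # jump set of x0
Sc = np.setdiff1d(np.arange(n - 1), S)
uS = np.sign(g[S])

# Precertificate for Phi = Id: p_F = grad^T u with u_S = sign((grad x0)_S) fixed,
# and u on S^c chosen to minimize ||grad^T u|| (least-norm element of aff dJ(x0)).
A = grad.T[:, Sc]
b = -grad.T[:, S] @ uS
v, *_ = np.linalg.lstsq(A, b, rcond=None)

u = np.zeros(n - 1); u[S] = uS; u[Sc] = v
print("max |u| between jumps:", np.max(np.abs(u[Sc])))
print("all jumps stable:", np.max(np.abs(u[Sc])) < 1)
```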
Take-away Message

Partial smoothness encodes low-complexity models through the singularities of the regularizer.
Future Work

Extended-valued functions: minimization under constraints,
$$\min_{x \in \mathbb{R}^n} \frac{1}{2}\|y - \Phi x\|^2 + \lambda J(x) \quad \text{subject to} \quad x \geq 0.$$

Non-convexity: fidelity and regularization, dictionary learning,
$$\min_{x_k \in \mathbb{R}^n,\, D \in \mathcal{D}} \sum_k \frac{1}{2}\|y_k - \Phi D x_k\|^2 + \lambda J(x_k).$$

Infinite-dimensional problems: partial smoothness for BV, Besov,
$$\min_{f \in \mathrm{BV}(\Omega) \cap L^2(\Omega)} \frac{1}{2}\|g - \Psi f\|_{L^2(\Omega)}^2 + \lambda |Df|(\Omega).$$

Compressed sensing: optimal bounds for partly smooth regularizers.
Thanks for your attention