Recent developments on sparse representation
1 Recent developments on sparse representation. Zeng Tieyong, Department of Mathematics, Hong Kong Baptist University. Dec. 8, 2008.
2 Outline: 1. Introduction 2. MP shrinkage algorithm 3. TV dictionary model 4. TV wavelet shrinkage 5. Conclusions
3 Background. Research on sparse representation. Dictionary learning task: given images, find the dictionary. Sparse representation task: given the dictionary, for any image, find its representation. Methods: Basis Pursuit, Matching Pursuit, Orthogonal Matching Pursuit.
4 Mathematical framework. Let H be a Hilbert space and let the analyzed signal/image v ∈ H contain some noise: v = u + b, where u is the clean image. Dictionary: a subset D = {ψ_i}_{i∈I} of H. Atom: an element of the dictionary, usually normalized. Task of sparse representation: find a linear expansion, using as few atoms as possible, to approximate v.
5 Background. Various approaches have been proposed for the sparse representation problem, such as: Basis Pursuit: S. Chen, D. Donoho, and M. Saunders, Atomic Decomposition by Basis Pursuit, SIAM J. Sci. Comput., Vol. 20(1), 1998. Matching Pursuit: S. Mallat and Z. Zhang, Matching Pursuits with Time-Frequency Dictionaries, IEEE Trans. Signal Process., Vol. 41(12), 1993. Orthogonal Matching Pursuit: Y. Pati, R. Rezaiifar and P. Krishnaprasad, Orthogonal Matching Pursuit: Recursive Function Approximation with Applications to Wavelet Decomposition, Proc. 27th Asilomar Conf. on Signals, Systems and Computers, 1993.
6 Basis Pursuit. This model considers: min_{(λ_i)_{i∈I}} ||v − Σ_{i∈I} λ_i ψ_i||² + α Σ_{i∈I} |λ_i|, where α is a regularization parameter. Advantages: solvable, in contrast to l0-norm minimization; easy to integrate into other variational models. Disadvantages: still a difficult optimization task; the tuning of the parameter α is not straightforward.
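One standard way to minimize this l1-penalized model (not discussed in the talk) is iterative soft-thresholding, ISTA. A minimal sketch for a small finite dictionary stored as lists of floats; the function names, the step size and the iteration count are illustrative choices:

```python
def soft_threshold(t, tau):
    """Prox of tau * |.|: shrink t toward zero by tau."""
    if t > tau:
        return t - tau
    if t < -tau:
        return t + tau
    return 0.0

def basis_pursuit_ista(v, atoms, alpha, step=0.1, n_iter=500):
    """Minimize ||v - sum_i lam_i psi_i||^2 + alpha * sum_i |lam_i|
    by ISTA: a gradient step on the quadratic term, then the l1 prox."""
    lam = [0.0] * len(atoms)
    for _ in range(n_iter):
        # current residual v - sum_i lam_i psi_i
        approx = [sum(lam[i] * atoms[i][j] for i in range(len(atoms)))
                  for j in range(len(v))]
        resid = [vj - aj for vj, aj in zip(v, approx)]
        # gradient of the quadratic term is -2 Psi^T resid
        lam = [soft_threshold(
                   lam[i] + 2 * step * sum(resid[j] * atoms[i][j]
                                           for j in range(len(v))),
                   step * alpha)
               for i in range(len(atoms))]
    return lam
```

With the orthonormal atoms e1, e2 and v = (3, 1), α = 1, the minimizer is the coordinatewise soft-thresholding of v by α/2, i.e. (2.5, 0.5).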
7 Matching Pursuit. MP approximates v by iteratively decomposing the n-th residual R^n v. Set R^0 v = v, n = 0. Iterate (loop in n): 1. find the best atom ψ_{γ_n} by γ_n = arg sup_{i∈I} |⟨R^n v, ψ_i⟩|; 2. sub-decompose: R^{n+1} v = R^n v − ⟨R^n v, ψ_{γ_n}⟩ ψ_{γ_n}. In applications, for a predefined M, we can take the first M terms as the result u: u = Σ_{n=0}^{M−1} ⟨R^n v, ψ_{γ_n}⟩ ψ_{γ_n}. (1)
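The two-step loop above can be sketched in a few lines of Python (an illustrative transcription, not code from the talk; the dictionary is assumed to be a finite list of unit-norm vectors):

```python
def matching_pursuit(v, atoms, n_iter):
    """Plain Matching Pursuit: greedily peel off the best-correlated atom.

    v     -- signal as a list of floats
    atoms -- finite list of unit-norm atoms (lists of floats)
    Returns the (atom index, coefficient) pairs and the final residual.
    """
    residual = list(v)
    picks = []
    for _ in range(n_iter):
        # 1. find the atom maximizing |<R^n v, psi_i>|
        corrs = [sum(r * a for r, a in zip(residual, atom)) for atom in atoms]
        best = max(range(len(atoms)), key=lambda i: abs(corrs[i]))
        c = corrs[best]
        # 2. sub-decompose: R^{n+1} v = R^n v - <R^n v, psi> psi
        residual = [r - c * a for r, a in zip(residual, atoms[best])]
        picks.append((best, c))
    return picks, residual
```

On an orthonormal dictionary, two iterations reconstruct a two-dimensional signal exactly: for v = (3, 4) and atoms e1, e2 the residual vanishes after two steps.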
8 Remarks on MP. Advantages: simple! correctly picks up the atoms when a sparse solution exists; useful for compression in the noiseless case. Disadvantages: heavy computation; what about the noisy case?
9 Outline: 1. Introduction 2. MP shrinkage algorithm 3. TV dictionary model 4. TV wavelet shrinkage 5. Conclusions
10 Wavelet shrinkage. Let D = (ψ_i)_{i∈I} ⊂ H be a wavelet basis. For the noisy image v, this method takes: u = Σ_{i∈I} θ_τ(⟨v, ψ_i⟩) ψ_i, (2) where τ is a fixed positive number and θ_τ is a shrinkage function. Typical examples: (soft thresholding) ρ_τ(t) = (|t| − τ) sgn(t) when |t| ≥ τ, 0 otherwise; (3) (hard thresholding) h_τ(t) = t when |t| ≥ τ, 0 otherwise. (4)
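The two thresholding rules (3) and (4) translate directly into code; a minimal sketch (function names are ours):

```python
def soft_threshold(t, tau):
    """rho_tau in (3): kill |t| < tau, otherwise shrink toward zero by tau."""
    if abs(t) < tau:
        return 0.0
    return (abs(t) - tau) * (1.0 if t > 0 else -1.0)

def hard_threshold(t, tau):
    """h_tau in (4): keep t untouched when |t| >= tau, else zero."""
    return t if abs(t) >= tau else 0.0
```

Soft thresholding shrinks surviving coefficients by τ (e.g. 5 becomes 3 for τ = 2), while hard thresholding keeps them unchanged, which is exactly the difference exploited in the next slides.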
11 Why shrinkage on a dictionary. One can observe that when D is a wavelet basis, MP is exactly wavelet shrinkage with the hard thresholding function. Is soft shrinkage better than hard shrinkage? Moreover, denote u_n := R^n v − b. As ψ_{γ_n} ∈ D is selected by MP, we have: ⟨R^n v, ψ_{γ_n}⟩ ψ_{γ_n} = ⟨u_n, ψ_{γ_n}⟩ ψ_{γ_n} + ⟨b, ψ_{γ_n}⟩ ψ_{γ_n}. It is inappropriate to brutally replace ⟨u_n, ψ_{γ_n}⟩ ψ_{γ_n} by ⟨R^n v, ψ_{γ_n}⟩ ψ_{γ_n}. We propose to shrink ⟨R^n v, ψ_{γ_n}⟩ at each iteration of MP.
12 General shrinkage functions. Definition 1. A function θ(·): R → R is called a shrinkage function if and only if it satisfies: 1. θ(·) is nondecreasing, i.e., ∀ t, t′ ∈ R, t ≤ t′ ⟹ θ(t) ≤ θ(t′); 2. θ(·) is a shrinkage, i.e., ∀ t ∈ R, |θ(t)| ≤ |t|.
13 The MP shrinkage algorithm. For v ∈ H fixed, let R^0 v = v and fix a factor α ∈ (0, 1]. Iterate for n ∈ N: find an atom ψ_{γ_n} ∈ D such that |⟨R^n v, ψ_{γ_n}⟩| ≥ α sup_{i∈I} |⟨R^n v, ψ_i⟩|; sub-decompose R^n v as R^n v = s_n ψ_{γ_n} + R^{n+1} v, (5) where s_n = θ(M_n) with M_n = ⟨R^n v, ψ_{γ_n}⟩. (6) Finally, take u = Σ_{n=0}^{+∞} s_n ψ_{γ_n}. A. θ = Id: usual MP; B. D a wavelet basis: wavelet shrinkage.
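The modified loop differs from plain MP only in the line computing s_n. A sketch with exact atom selection (α = 1), again assuming a finite dictionary of unit-norm vectors; the function names are ours:

```python
def soft_threshold(t, tau):
    """Soft shrinkage: kill |t| < tau, otherwise shrink toward zero by tau."""
    if abs(t) < tau:
        return 0.0
    return (abs(t) - tau) * (1.0 if t > 0 else -1.0)

def mp_shrinkage(v, atoms, theta, n_iter):
    """MP shrinkage: at each step shrink the coefficient M_n = <R^n v, psi>
    with theta before subtracting the atom's contribution."""
    residual = list(v)
    u = [0.0] * len(v)
    for _ in range(n_iter):
        corrs = [sum(r * a for r, a in zip(residual, atom)) for atom in atoms]
        best = max(range(len(atoms)), key=lambda i: abs(corrs[i]))
        s = theta(corrs[best])                    # s_n = theta(M_n), Eq. (6)
        # R^{n+1} v = R^n v - s_n psi_{gamma_n}, Eq. (5)
        residual = [r - s * a for r, a in zip(residual, atoms[best])]
        u = [ui + s * a for ui, a in zip(u, atoms[best])]
    return u, residual
```

With theta = Id this is plain MP; with a soft threshold each picked coefficient is shrunk by τ, so on an orthonormal basis the residual no longer vanishes but keeps a τ-sized remainder per coordinate.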
14 Convergence of MP shrinkage. Does MP shrinkage converge? Theorem 1. Let (ψ_i)_{i∈I} be a normed dictionary, v ∈ H and θ(·) a shrinkage function. The sequences defined in Eq. (6) satisfy: (R^n v)_{n∈N} converges. As a consequence, Σ_{n=0}^{+∞} s_n ψ_{γ_n} exists. We denote the limit of (R^n v)_{n∈N} by R^{+∞} v, and we trivially have v = Σ_{n=0}^{+∞} s_n ψ_{γ_n} + R^{+∞} v.
15 Bound on l1 regularity. Interior threshold: τ⁻ := inf{|t| : θ(t) ≠ 0}. Exterior threshold: τ⁺ := sup{|t| : θ(t) = 0}. θ is a thresholding function iff τ⁻ > 0. Theorem 2. Let (ψ_i)_{i∈I} be a normed dictionary, v ∈ H and θ(·) a thresholding function. The quantities defined in Eq. (6) satisfy: Σ_{n=0}^{+∞} |s_n| ≤ (||v||² − ||R^{+∞} v||²)/τ⁻ ≤ ||v||²/τ⁻, (7) where τ⁻ > 0 denotes the interior threshold.
16 Bound on the residual norm. Define the semi-norm on H: ||u||_D := sup_{i∈I} |⟨u, ψ_i⟩|, for u ∈ H. Let V = Span{D}, with V ⊕ V^⊥ = H. Theorem 3. Let (ψ_i)_{i∈I} be a normed dictionary, v ∈ H and θ(·) a shrinkage function. The limits of MP shrinkage satisfy ||R^{+∞} v − P_{V^⊥} v||_D ≤ τ⁺/α, and ||Σ_{n=0}^{+∞} s_n ψ_{γ_n} − P_V v||_D ≤ τ⁺/α, where τ⁺ is the exterior threshold.
17 Experiments for MP/MP shrinkage. Figure 1: Basic filters used to construct the dictionary. Each filter is extended to the same size as the underlying noisy image by zero-padding and then translated over the plane. Left: DCT filters; right: nine letter filters.
18 Convergence of s_n. Figure 2: The quantity |s_n| (y-axis) as a function of the iteration number n (x-axis), for MP shrinkage with the soft thresholding function, for τ = 0 (i.e. MP), τ = 10, 50 and 100.
19 Detection by letter dictionary. Figure 3: Top-left: clean image; top-middle: noisy image with Gaussian noise of std 150; top-right: denoised image by ROF (λ = ); bottom-left: wavelet soft-shrinkage (τ = 400); bottom-middle: MP with the letter dictionary; bottom-right: MP shrinkage with the letter dictionary (soft shrinkage with τ = 400).
20 Outline: 1. Introduction 2. MP shrinkage algorithm 3. TV dictionary model 4. TV wavelet shrinkage 5. Conclusions
21 Total variation dictionary model. The task of image denoising is to recover an ideal image u ∈ L²(Ω) from a noisy observation v = u + b, where Ω is a rectangle of R² on which the image is defined, v ∈ L²(Ω) is the noisy image and b ∈ L²(Ω) is Gaussian noise of standard deviation σ. We are interested in the following total variation dictionary model: (P): min TV(w) subject to |⟨w − v, ψ⟩| ≤ τ, ∀ ψ ∈ D, for a finite dictionary D ⊂ L²(Ω), which is often symmetric, and a positive parameter τ associated with the noise level.
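The constraint set of (P) is straightforward to test numerically. A small illustrative sketch (finite-dimensional, with the image and atoms flattened to lists; the function name is ours) that checks whether a candidate w satisfies every dictionary constraint:

```python
def is_feasible(w, v, atoms, tau):
    """Check the constraint of (P): |<w - v, psi>| <= tau for all psi in D."""
    diff = [wi - vi for wi, vi in zip(w, v)]
    return all(abs(sum(d * a for d, a in zip(diff, atom))) <= tau
               for atom in atoms)
```

In a solver for (P), a test like this identifies which constraints are active; by complementary slackness, only the active atoms carry nonzero Lagrange multipliers.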
22 Remarks on (P). Advantages: better than the ROF model for texture recovery; rather flexible. Key problem: how to design the dictionary.
23 KKT parameters. Suppose w is a solution of (P). Using the Kuhn-Tucker theorem, we know that there exist positive Lagrangian parameters (λ_ψ)_{ψ∈D} such that: Σ_{ψ∈D} λ_ψ ψ = div(∇w/|∇w|). (8)
24 Dual form. Let TV* be the conjugate function of TV. Let f: H → [−∞, +∞] be a convex function. The Bregman distance associated with f at points p, q ∈ H is: B_f(p, q) = f(p) − f(q) − ⟨∂f(q), p − q⟩, where ∂f(q) is a subgradient of f at the point q. Theorem 1. The dual problem of (P) is: min_{(λ_ψ)_{ψ∈D} ≥ 0} B_{TV*}(Σ_{ψ∈D} λ_ψ ψ, div(∇v/|∇v|)) + τ Σ_{ψ∈D} λ_ψ, (9) where B_{TV*} is the Bregman distance associated with TV*.
25 Ad-hoc dictionary. The sparsest case: the curvature of the ideal image. Figure 4: Left: curvature of the Lena image; right: curvature of letters.
26 Figure 5: Denoising by (P) with the ad-hoc dictionary, compared with ROF. Top: clean image, noisy image (σ = 20, PSNR = 22.11); middle: result of ROF (PSNR = 27.66), result of (P) with the ad-hoc dictionary (PSNR = 34.93); bottom: residue of ROF and of (P).
27 Figure 6: Image decomposition. Top: clean image, noisy image to decompose, obtained with 20% impulse noise; middle: cartoon part and noisy-texture part of the ROF model; bottom: letter part and background-noise part of model (P).
28 Figure 7: Image denoising. Top: clean image, noisy image with σ = 20, PSNR = 22.08; middle: denoising result of ROF with PSNR = 24.56, residue of ROF; bottom: denoising result of (P) with PSNR = 31.20, residue of (P).
29 Outline: 1. Introduction 2. MP shrinkage algorithm 3. TV dictionary model 4. TV wavelet shrinkage 5. Conclusions
30 Total variation based shrinkage model. Total variation model: min_{w∈H} (1/2)||f − w||² + β TV(w). Wavelet shrinkage: min_{r∈H} (1/2)||f − r||² + α Σ_{ϕ∈D} |⟨r, ϕ⟩|. TV wavelet shrinkage: min_{w,r∈H} E(r, w) := (1/2)||f − r − w||² + α Σ_{ϕ∈D} |⟨r, ϕ⟩| + β TV(w).
31 Alternating minimization direction algorithm. 1. Initialize r_0. 2. Repeat until convergence: update w_n = arg min_w (1/2)||(f − r_{n−1}) − w||² + β TV(w), (10) and r_n = arg min_r (1/2)||(f − w_n) − r||² + α Σ_{ϕ∈D} |⟨r, ϕ⟩|; (11) then take u_n = w_n + r_n.
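The alternation above can be sketched in a heavily simplified setting, none of which comes from the talk: a 1D signal, D taken as the standard coordinate basis (so the r-step (11) reduces to coordinatewise soft-thresholding of f − w_n), and the ROF step (10) solved approximately by a Chambolle-style projected-gradient iteration on the dual variable:

```python
def soft_threshold(t, tau):
    """Soft shrinkage: kill |t| < tau, otherwise shrink toward zero by tau."""
    if abs(t) < tau:
        return 0.0
    return (abs(t) - tau) * (1.0 if t > 0 else -1.0)

def tv_denoise_1d(g, beta, n_iter=500):
    """Approximately solve w = argmin_w 0.5*||g - w||^2 + beta*TV(w) in 1D
    by projected gradient on the dual variable p, constrained to |p_i| <= beta."""
    n = len(g)
    p = [0.0] * (n - 1)
    for _ in range(n_iter):
        # primal estimate: w_i = g_i - (p_{i-1} - p_i), with zero boundary p
        w = [g[i] - (p[i - 1] if i > 0 else 0.0)
                  + (p[i] if i < n - 1 else 0.0) for i in range(n)]
        # ascent step along the forward differences of w, then projection
        p = [min(beta, max(-beta, p[i] + 0.25 * (w[i + 1] - w[i])))
             for i in range(n - 1)]
    return [g[i] - (p[i - 1] if i > 0 else 0.0)
                 + (p[i] if i < n - 1 else 0.0) for i in range(n)]

def tv_wavelet_shrinkage(f, alpha, beta, n_outer=20):
    """Alternate the two partial minimizations (10) and (11)."""
    r = [0.0] * len(f)
    w = list(f)
    for _ in range(n_outer):
        w = tv_denoise_1d([fi - ri for fi, ri in zip(f, r)], beta)    # (10)
        r = [soft_threshold(fi - wi, alpha) for fi, wi in zip(f, w)]  # (11)
    return [wi + ri for wi, ri in zip(w, r)]  # u_n = w_n + r_n
```

On a clean step signal (0, 0, 0, 10, 10, 10) with β = 0.6, the ROF step pulls each constant block toward the other by β/3 = 0.2; with α = 0.5 the residual f − w stays below the shrinkage threshold, so r vanishes and u ≈ (0.2, 0.2, 0.2, 9.8, 9.8, 9.8).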
32 Strong convergence. Theorem 2. We have: 1. the sequence (w_n, r_n) converges strongly to a global minimizer of E(r, w); 2. the sequence (u_n) converges strongly to a unique point regardless of the initialization (r_0, w_0).
33 Experiments on Cameraman. Figure 8: Top-left: clean image; top-right: noisy image with Gaussian std 20; bottom-left: wavelet soft-shrinkage with α = 50, SNR = 11.89; bottom-middle: ROF, SNR = 13.61; bottom-right: new model with α = 50, β = 60, SNR = .
34 Conclusion: 1. MP shrinkage algorithm 2. TV dictionary model 3. TV wavelet shrinkage
More informationTruncation Strategy of Tensor Compressive Sensing for Noisy Video Sequences
Journal of Information Hiding and Multimedia Signal Processing c 2016 ISSN 207-4212 Ubiquitous International Volume 7, Number 5, September 2016 Truncation Strategy of Tensor Compressive Sensing for Noisy
More informationInverse problems and sparse models (1/2) Rémi Gribonval INRIA Rennes - Bretagne Atlantique, France
Inverse problems and sparse models (1/2) Rémi Gribonval INRIA Rennes - Bretagne Atlantique, France remi.gribonval@inria.fr Structure of the tutorial Session 1: Introduction to inverse problems & sparse
More informationA Dual Sparse Decomposition Method for Image Denoising
A Dual Sparse Decomposition Method for Image Denoising arxiv:1704.07063v1 [cs.cv] 24 Apr 2017 Hong Sun 1 School of Electronic Information Wuhan University 430072 Wuhan, China 2 Dept. Signal and Image Processing
More informationRandomness-in-Structured Ensembles for Compressed Sensing of Images
Randomness-in-Structured Ensembles for Compressed Sensing of Images Abdolreza Abdolhosseini Moghadam Dep. of Electrical and Computer Engineering Michigan State University Email: abdolhos@msu.edu Hayder
More informationarxiv: v1 [stat.ml] 1 Mar 2015
Matrix Completion with Noisy Entries and Outliers Raymond K. W. Wong 1 and Thomas C. M. Lee 2 arxiv:1503.00214v1 [stat.ml] 1 Mar 2015 1 Department of Statistics, Iowa State University 2 Department of Statistics,
More informationElaine T. Hale, Wotao Yin, Yin Zhang
, Wotao Yin, Yin Zhang Department of Computational and Applied Mathematics Rice University McMaster University, ICCOPT II-MOPTA 2007 August 13, 2007 1 with Noise 2 3 4 1 with Noise 2 3 4 1 with Noise 2
More informationSparse Solutions of Linear Systems of Equations and Sparse Modeling of Signals and Images: Final Presentation
Sparse Solutions of Linear Systems of Equations and Sparse Modeling of Signals and Images: Final Presentation Alfredo Nava-Tudela John J. Benedetto, advisor 5/10/11 AMSC 663/664 1 Problem Let A be an n
More informationSOS Boosting of Image Denoising Algorithms
SOS Boosting of Image Denoising Algorithms Yaniv Romano and Michael Elad The Technion Israel Institute of technology Haifa 32000, Israel The research leading to these results has received funding from
More informationMMSE Denoising of 2-D Signals Using Consistent Cycle Spinning Algorithm
Denoising of 2-D Signals Using Consistent Cycle Spinning Algorithm Bodduluri Asha, B. Leela kumari Abstract: It is well known that in a real world signals do not exist without noise, which may be negligible
More informationEE 381V: Large Scale Optimization Fall Lecture 24 April 11
EE 381V: Large Scale Optimization Fall 2012 Lecture 24 April 11 Lecturer: Caramanis & Sanghavi Scribe: Tao Huang 24.1 Review In past classes, we studied the problem of sparsity. Sparsity problem is that
More informationOPTIMAL SURE PARAMETERS FOR SIGMOIDAL WAVELET SHRINKAGE
17th European Signal Processing Conference (EUSIPCO 009) Glasgow, Scotland, August 4-8, 009 OPTIMAL SURE PARAMETERS FOR SIGMOIDAL WAVELET SHRINKAGE Abdourrahmane M. Atto 1, Dominique Pastor, Gregoire Mercier
More informationRobust Sparse Recovery via Non-Convex Optimization
Robust Sparse Recovery via Non-Convex Optimization Laming Chen and Yuantao Gu Department of Electronic Engineering, Tsinghua University Homepage: http://gu.ee.tsinghua.edu.cn/ Email: gyt@tsinghua.edu.cn
More informationPractical Signal Recovery from Random Projections
Practical Signal Recovery from Random Projections Emmanuel Candès and Justin Romberg Abstract Can we recover a signal f R N from a small number of linear measurements? A series of recent papers developed
More informationA NEW FRAMEWORK FOR DESIGNING INCOHERENT SPARSIFYING DICTIONARIES
A NEW FRAMEWORK FOR DESIGNING INCOERENT SPARSIFYING DICTIONARIES Gang Li, Zhihui Zhu, 2 uang Bai, 3 and Aihua Yu 3 School of Automation & EE, Zhejiang Univ. of Sci. & Tech., angzhou, Zhejiang, P.R. China
More informationAn iterative hard thresholding estimator for low rank matrix recovery
An iterative hard thresholding estimator for low rank matrix recovery Alexandra Carpentier - based on a joint work with Arlene K.Y. Kim Statistical Laboratory, Department of Pure Mathematics and Mathematical
More informationDenoising and Compression Using Wavelets
Denoising and Compression Using Wavelets December 15,2016 Juan Pablo Madrigal Cianci Trevor Giannini Agenda 1 Introduction Mathematical Theory Theory MATLAB s Basic Commands De-Noising: Signals De-Noising:
More informationSparse Optimization Lecture: Basic Sparse Optimization Models
Sparse Optimization Lecture: Basic Sparse Optimization Models Instructor: Wotao Yin July 2013 online discussions on piazza.com Those who complete this lecture will know basic l 1, l 2,1, and nuclear-norm
More informationSparse Solutions of Systems of Equations and Sparse Modelling of Signals and Images
Sparse Solutions of Systems of Equations and Sparse Modelling of Signals and Images Alfredo Nava-Tudela ant@umd.edu John J. Benedetto Department of Mathematics jjb@umd.edu Abstract In this project we are
More informationLearning MMSE Optimal Thresholds for FISTA
MITSUBISHI ELECTRIC RESEARCH LABORATORIES http://www.merl.com Learning MMSE Optimal Thresholds for FISTA Kamilov, U.; Mansour, H. TR2016-111 August 2016 Abstract Fast iterative shrinkage/thresholding algorithm
More informationSignal Denoising with Wavelets
Signal Denoising with Wavelets Selin Aviyente Department of Electrical and Computer Engineering Michigan State University March 30, 2010 Introduction Assume an additive noise model: x[n] = f [n] + w[n]
More information