Reweighted Laplace Prior Based Hyperspectral Compressive Sensing for Unknown Sparsity: Supplementary Material
Lei Zhang, Wei Wei, Yanning Zhang, Chunna Tian, Fei Li
School of Computer Science, Northwestern Polytechnical University, Xi'an 710072, China
School of Electronic Engineering, Xidian University, Xi'an 710071, China

This document is the supplementary material for the CVPR 2015 paper "Reweighted Laplace Prior Based Hyperspectral Compressive Sensing for Unknown Sparsity". It contains three parts:

Section 1: Derivation of the reweighted Laplace prior.
Section 2: Derivation of the optimization procedure.
Section 3: More experimental results.

1. Derivation of the Reweighted Laplace Prior

In this section, we present the detailed derivation of the reweighted Laplace prior from the hierarchical structure. For each column $\mathbf{y}_i$ of $Y$, we have

\[
\begin{aligned}
p(\mathbf{y}_i) &= \int p(\mathbf{y}_i \mid \boldsymbol{\gamma})\, p(\boldsymbol{\gamma} \mid \boldsymbol{\kappa})\, d\boldsymbol{\gamma} \\
&= \int (2\pi)^{-n_b/2} |\Sigma_y|^{-1/2} \exp\Big\{ -\tfrac{1}{2}\, \mathbf{y}_i^T \Sigma_y^{-1} \mathbf{y}_i \Big\} \prod_{j=1}^{n_b} \frac{\kappa_j}{2} \exp\Big\{ -\frac{\kappa_j \gamma_j}{2} \Big\}\, d\boldsymbol{\gamma} \\
&= \prod_{j=1}^{n_b} \int \frac{\kappa_j}{2}\, (2\pi\gamma_j)^{-1/2} \exp\Big\{ -\frac{y_{ji}^2/\gamma_j + \kappa_j \gamma_j}{2} \Big\}\, d\gamma_j \\
&= \prod_{j=1}^{n_b} \frac{\sqrt{\kappa_j}}{2} \exp\big\{ -\sqrt{\kappa_j}\, |y_{ji}| \big\} \int \underbrace{\mathrm{GIG}\big(\gamma_j \mid \kappa_j, y_{ji}^2, 1/2\big)}_{\propto\; \gamma_j^{-1/2} \exp\{ -( y_{ji}^2/\gamma_j + \kappa_j \gamma_j )/2 \}}\, d\gamma_j \\
&= \prod_{j=1}^{n_b} \frac{\sqrt{\kappa_j}}{2} \exp\big\{ -\sqrt{\kappa_j}\, |y_{ji}| \big\} \;\propto\; \exp\big\{ -\| K \mathbf{y}_i \|_1 \big\},
\end{aligned} \tag{1}
\]

where $\Sigma_y = \mathrm{diag}([\gamma_1, \ldots, \gamma_{n_b}]^T)$ is the covariance matrix of $Y$ (for a vector $x$, $\mathrm{diag}(x)$ denotes a diagonal matrix with the elements of $x$; for a matrix $X$, $\mathrm{diag}(X)$ denotes the vector of diagonal elements of $X$), $K = \mathrm{diag}([\sqrt{\kappa_1}, \ldots, \sqrt{\kappa_{n_b}}]^T)$ is the weight matrix, and $\mathrm{GIG}(\gamma_j \mid \kappa_j, y_{ji}^2, 1/2)$ denotes the pdf of the generalized inverse Gaussian distribution, which integrates to one. It can be seen that the proposed hierarchical prior results in a reweighted Laplace prior on each column of $Y$ with weight matrix $K$.
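The scale-mixture integral in Eq. (1) can be checked numerically. The following standalone sketch (not part of the original supplement; all values are illustrative) integrates the Gaussian-exponential mixture for a single entry on a dense grid and compares it against the closed-form Laplace density $(\sqrt{\kappa}/2)\exp(-\sqrt{\kappa}\,|y|)$:

```python
import numpy as np

def laplace_from_scale_mixture(y, kappa, n_grid=200000, gamma_max=400.0):
    """Numerically evaluate int_0^inf N(y; 0, gamma) * (kappa/2) exp(-kappa*gamma/2) dgamma
    with a trapezoidal rule on a dense gamma grid."""
    g = np.linspace(1e-9, gamma_max, n_grid)
    integrand = (2.0 * np.pi * g) ** -0.5 * np.exp(-y ** 2 / (2.0 * g)) \
                * (kappa / 2.0) * np.exp(-kappa * g / 2.0)
    # trapezoidal rule, written out to avoid version-specific numpy helpers
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(g)))

y, kappa = 1.3, 2.0  # illustrative values
numeric = laplace_from_scale_mixture(y, kappa)
closed_form = np.sqrt(kappa) / 2.0 * np.exp(-np.sqrt(kappa) * np.abs(y))
print(numeric, closed_form)  # the two values agree to high precision
```

The same check passes for any $y \neq 0$ and $\kappa > 0$, which is exactly the statement that an exponential hyperprior on the variances turns the Gaussian likelihood into a Laplace marginal.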
2. Derivation of the Optimization Procedure

In this section, we present the detailed derivation of the sparsity learning over $\gamma$ and the noise estimation over $\lambda$ in the optimization procedure of the main paper.

2.1. Sparse Signal Recovery: Optimizing over $\gamma$

Given $Y$, $\lambda$ and $\kappa$, the sub-problem over $\gamma$ is

\[
\min_{\gamma}\; \frac{1}{n_p}\,\mathrm{tr}\big(Y^T \Sigma_y^{-1} Y\big) + \log|\Sigma_{by}| + \frac{1}{n_p}\sum_{i=1}^{n_b} \big(\kappa_i \gamma_i - \log \kappa_i\big), \tag{2}
\]

where $\Sigma_{by} = \Sigma_n + A D \Sigma_y D^T A^T$. According to matrix algebra, we rewrite the $\log|\Sigma_{by}|$ term as

\[
\begin{aligned}
\log|\Sigma_{by}| &= \log\big|\Sigma_n + A D \Sigma_y D^T A^T\big| \\
&= \log\big|I_{m_b} + \Sigma_n^{-1} A D \Sigma_y D^T A^T\big| + \log|\Sigma_n| \\
&= \log\big|\Sigma_y^{-1} + D^T A^T \Sigma_n^{-1} A D\big| + \log|\Sigma_y| + \log|\Sigma_n|,
\end{aligned} \tag{3}
\]

where $I_{m_b} \in \mathbb{R}^{m_b \times m_b}$ is the identity matrix. Substituting Eq. (3) into Eq. (2) and removing the terms irrelevant to $\gamma$, we obtain a new sub-problem over $\gamma$:

\[
\min_{\gamma}\; \frac{1}{n_p}\,\mathrm{tr}\big(Y^T \Sigma_y^{-1} Y\big) + \log\big|\Sigma_y^{-1} + D^T A^T \Sigma_n^{-1} A D\big| + \log|\Sigma_y| + \frac{1}{n_p}\sum_{i=1}^{n_b} \kappa_i \gamma_i. \tag{4}
\]

Since the function $f(\gamma^{-1}) \triangleq \log|\Sigma_y^{-1} + D^T A^T \Sigma_n^{-1} A D|$ is concave in $\gamma^{-1} \triangleq [\gamma_1^{-1}, \ldots, \gamma_{n_b}^{-1}]^T$, Eq. (4) is a non-convex optimization. We therefore utilize the conjugate function from convex optimization [3] to transform this non-convex optimization into a convex one by finding a tight upper bound of $f$:

\[
f(\gamma^{-1}) = \log\big|\Sigma_y^{-1} + D^T A^T \Sigma_n^{-1} A D\big| \le \mathbf{z}^T \gamma^{-1} - f^*(\mathbf{z}), \quad \forall\, \mathbf{z} \ge 0, \tag{5}
\]

where $f^*(\mathbf{z})$ is the concave conjugate of $f$ and $\mathbf{z} = [z_1, \ldots, z_{n_b}]^T$. It can be proved that equality in Eq. (5) holds only when

\[
\begin{aligned}
z_i &= \frac{\partial}{\partial \gamma_i^{-1}} \log\big|\Sigma_y^{-1} + D^T A^T \Sigma_n^{-1} A D\big| \\
&= \mathrm{tr}\Big[ \big(\Sigma_y^{-1} + D^T A^T \Sigma_n^{-1} A D\big)^{-1}\, \frac{\partial}{\partial \gamma_i^{-1}} \Big( \sum_{j=1}^{n_b} \gamma_j^{-1} \mathbf{e}_j \mathbf{e}_j^T + D^T A^T \Sigma_n^{-1} A D \Big) \Big] \\
&= \mathrm{tr}\Big[ \big(\Sigma_y^{-1} + D^T A^T \Sigma_n^{-1} A D\big)^{-1} \mathbf{e}_i \mathbf{e}_i^T \Big]
= \mathrm{tr}\Big[ \mathbf{e}_i^T \big(\Sigma_y^{-1} + D^T A^T \Sigma_n^{-1} A D\big)^{-1} \mathbf{e}_i \Big],
\end{aligned} \tag{6}
\]

where $\mathbf{e}_i$ is the $i$-th standard unit vector in $\mathbb{R}^{n_b}$. To decrease the number of inverse operations in Eq. (6), we use the Woodbury identity to simplify the formula:

\[
\begin{aligned}
z_i &= \mathrm{tr}\Big[ \mathbf{e}_i^T \big(\Sigma_y^{-1} + D^T A^T \Sigma_n^{-1} A D\big)^{-1} \mathbf{e}_i \Big] \\
&= \mathrm{tr}\Big\{ \mathbf{e}_i^T \Big[ \Sigma_y - \Sigma_y D^T A^T \big(\Sigma_n + A D \Sigma_y D^T A^T\big)^{-1} A D \Sigma_y \Big] \mathbf{e}_i \Big\} \\
&= \mathrm{tr}\Big[ \mathbf{e}_i^T \big( \Sigma_y - \Sigma_y D^T A^T \Sigma_{by}^{-1} A D \Sigma_y \big) \mathbf{e}_i \Big].
\end{aligned} \tag{7}
\]

In summary, the formula for $\mathbf{z}$ can be written as

\[
\mathbf{z} = \mathrm{diag}\big( \Sigma_y - \Sigma_y D^T A^T \Sigma_{by}^{-1} A D \Sigma_y \big). \tag{8}
\]
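The determinant decomposition in Eq. (3) and the Woodbury simplification behind Eqs. (6)–(8) are standard matrix identities, and both can be verified numerically on random matrices. A minimal sketch (not part of the original supplement; the toy sizes and the random `A`, `D` and diagonal covariances are hypothetical stand-ins for the quantities in the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
m_b, n_b = 6, 12                          # toy numbers of measurements and bands
A = rng.standard_normal((m_b, 20))        # hypothetical sensing matrix
D = rng.standard_normal((20, n_b))        # hypothetical basis/dictionary
AD = A @ D

Sigma_y = np.diag(rng.uniform(0.5, 2.0, n_b))   # diag(gamma)
Sigma_n = np.diag(rng.uniform(0.5, 2.0, m_b))   # diag(lambda)
Sigma_by = Sigma_n + AD @ Sigma_y @ AD.T

Sy_inv = np.linalg.inv(Sigma_y)
Sn_inv = np.linalg.inv(Sigma_n)
M = Sy_inv + AD.T @ Sn_inv @ AD

# Eq. (3): log|Sigma_by| = log|M| + log|Sigma_y| + log|Sigma_n|
lhs = np.linalg.slogdet(Sigma_by)[1]
rhs = (np.linalg.slogdet(M)[1] + np.linalg.slogdet(Sigma_y)[1]
       + np.linalg.slogdet(Sigma_n)[1])

# Eqs. (6) vs (8): diag(M^{-1}) equals diag(Sigma_y - Sigma_y AD^T Sigma_by^{-1} AD Sigma_y)
z_eq6 = np.diag(np.linalg.inv(M))
z_eq8 = np.diag(Sigma_y - Sigma_y @ AD.T @ np.linalg.inv(Sigma_by) @ AD @ Sigma_y)

print(abs(lhs - rhs), np.max(np.abs(z_eq6 - z_eq8)))  # both are numerically ~0
```

The second check is the practical point of Eq. (8): it replaces the $n_b \times n_b$ inverse in Eq. (6) with a single $m_b \times m_b$ inverse of $\Sigma_{by}$, which is much cheaper when $m_b \ll n_b$.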
Substituting Eq. (5) into Eq. (4), we obtain the revised sub-problem

\[
\min_{\gamma}\; \frac{1}{n_p}\,\mathrm{tr}\big(Y^T \Sigma_y^{-1} Y\big) + \mathbf{z}^T \gamma^{-1} + \log|\Sigma_y| + \frac{1}{n_p}\sum_{i=1}^{n_b} \kappa_i \gamma_i
= \min_{\gamma}\; \sum_{i=1}^{n_b} \Big( \frac{\bar{y}_i + z_i}{\gamma_i} + \log \gamma_i + \frac{\kappa_i \gamma_i}{n_p} \Big), \tag{9}
\]

where $\bar{y}_i$ denotes the $i$-th element of $\bar{\mathbf{y}} = \mathrm{diag}(Y Y^T)/n_p = [\bar{y}_1, \ldots, \bar{y}_{n_b}]^T$. Since $\gamma \ge 0$, setting the derivative of each summand to zero yields the quadratic equation $(\kappa_i/n_p)\gamma_i^2 + \gamma_i - (\bar{y}_i + z_i) = 0$, whose positive root gives the solution

\[
\gamma_i^{\mathrm{new}} = \frac{ \sqrt{ 4 \kappa_i n_p (\bar{y}_i + z_i) + n_p^2 } - n_p }{ 2 \kappa_i }. \tag{10}
\]

2.2. Noise Estimation: Optimizing over $\lambda$

With the recovered sparse signal $Y$ and prior parameter $\gamma$, we have the following sub-problem over $\lambda$:

\[
\min_{\lambda}\; \frac{1}{n_p}\,\big\| \Sigma_n^{-1/2} (A D Y - G) \big\|_F^2 + \log|\Sigma_{by}|, \tag{11}
\]

where $\Sigma_{by} = \Sigma_n + A D \Sigma_y D^T A^T$. Similar to the optimization of $\gamma$, we use a conjugate function to find a tight upper bound of the concave function $\varphi(\lambda) \triangleq \log|\Sigma_{by}| = \log|\Sigma_n + A D \Sigma_y D^T A^T|$:

\[
\varphi(\lambda) = \log\big|\Sigma_n + A D \Sigma_y D^T A^T\big| \le \boldsymbol{\alpha}^T \lambda - \varphi^*(\boldsymbol{\alpha}), \quad \forall\, \boldsymbol{\alpha} \ge 0, \tag{12}
\]

where $\varphi^*(\boldsymbol{\alpha})$ is the concave conjugate of $\varphi(\lambda)$, $\Sigma_n = \mathrm{diag}(\lambda)$ and $\boldsymbol{\alpha} = [\alpha_1, \ldots, \alpha_{m_b}]^T$. It can be proved that equality in Eq. (12) holds only when

\[
\alpha_i = \frac{\partial}{\partial \lambda_i} \log\big|\Sigma_n + A D \Sigma_y D^T A^T\big|
= \mathrm{tr}\Big[ \boldsymbol{\xi}_i^T \big(\Sigma_n + A D \Sigma_y D^T A^T\big)^{-1} \boldsymbol{\xi}_i \Big]
= \mathrm{tr}\big[ \boldsymbol{\xi}_i^T \Sigma_{by}^{-1} \boldsymbol{\xi}_i \big], \tag{13}
\]

where $\boldsymbol{\xi}_i$ is the $i$-th standard unit vector in $\mathbb{R}^{m_b}$. Further, $\boldsymbol{\alpha}$ can be written as

\[
\boldsymbol{\alpha} = \mathrm{diag}\big( \Sigma_{by}^{-1} \big). \tag{14}
\]

Substituting Eq. (12) into Eq. (11), we obtain the revised sub-problem over $\lambda$:

\[
\min_{\lambda}\; \frac{1}{n_p}\,\big\| \Sigma_n^{-1/2} (A D Y - G) \big\|_F^2 + \boldsymbol{\alpha}^T \lambda
= \min_{\lambda}\; \sum_{i=1}^{m_b} \Big( \frac{\bar{q}_i}{\lambda_i} + \alpha_i \lambda_i \Big), \tag{15}
\]

where $\bar{q}_i$ denotes the $i$-th element of $\bar{\mathbf{q}} = \mathrm{diag}(Q Q^T) = [\bar{q}_1, \ldots, \bar{q}_{m_b}]^T$ with $Q = (A D Y - G)/\sqrt{n_p}$. Since $\lambda \ge 0$, the solution over each $\lambda_i$ is

\[
\lambda_i^{\mathrm{new}} = \sqrt{ \frac{\bar{q}_i}{\alpha_i} }. \tag{16}
\]

3. More Experimental Results

In this section, we provide more experimental results on the orthogonal-transformation-based HCS and the dictionary-based HCS.

3.1. Performance Evaluation on Orthogonal Transformation Based HCS

Taking the Haar wavelet as the linear basis matrix $D$, we run OMP [9], StOMP [5], Lasso [6], FL [1], RCS [4] and the proposed method to reconstruct the HSI from the linear measurements $G$, which are corrupted with additive white Gaussian noise. The HSIs come from three real hyperspectral datasets, namely INDIANA, URBAN and PAVIAU. Figure 1 and Figure 2 give the visual reconstruction results of all competing methods at a fixed sampling rate $\rho$ under the two SNR levels of the measurements.
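As a sanity check on the closed-form updates in Eq. (10) and Eq. (16), one can compare them against a brute-force grid minimization of the per-coordinate objectives in Eq. (9) and Eq. (15). A minimal sketch (not part of the original supplement; the scalar values below are hypothetical, chosen only for this check):

```python
import numpy as np

def gamma_update(y_bar, z, kappa, n_p):
    """Eq. (10): positive root of (kappa/n_p) * g**2 + g - (y_bar + z) = 0."""
    return (np.sqrt(4.0 * kappa * n_p * (y_bar + z) + n_p ** 2) - n_p) / (2.0 * kappa)

def lambda_update(q_bar, alpha):
    """Eq. (16): minimizer of q_bar / lam + alpha * lam over lam > 0."""
    return np.sqrt(q_bar / alpha)

# hypothetical scalar values for one coordinate
y_bar, z, kappa, n_p = 0.7, 0.3, 1.5, 50
q_bar, alpha = 0.8, 2.0

g_star = gamma_update(y_bar, z, kappa, n_p)
l_star = lambda_update(q_bar, alpha)

# brute-force minimization of the per-coordinate objectives
g_grid = np.linspace(1e-4, 10.0, 200001)
g_obj = (y_bar + z) / g_grid + np.log(g_grid) + kappa * g_grid / n_p   # Eq. (9) summand
l_grid = np.linspace(1e-4, 5.0, 200001)
l_obj = q_bar / l_grid + alpha * l_grid                                # Eq. (15) summand

print(g_star, g_grid[np.argmin(g_obj)])  # closed form vs. grid minimizer
print(l_star, l_grid[np.argmin(l_obj)])
```

Both objectives are unimodal on the positive axis, so the grid minimizer lands on the closed-form solution up to the grid resolution; this is what makes the coordinate-wise updates of the overall alternating procedure exact.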
The results in Figure 1 and Figure 2 indicate that the proposed method produces reconstructions visually closest to the original bands among all competing methods. This conclusion is consistent with those in the main paper.
Figure 1. Visual reconstruction results of a selected band from INDIANA, the 6th band from URBAN and the 9th band from PAVIAU at a fixed sampling rate $\rho$ under the first SNR level of the measurements. All the comparison methods recover the HSI in the wavelet-transform-based HCS framework. (a) OMP, (b) StOMP, (c) Lasso, (d) FL, (e) RCS, (f) the proposed method, (g) original bands.

Figure 2. Reconstruction results of the same bands at the same sampling rate $\rho$ under the second SNR level of the measurements. All the comparison methods recover the HSI in the wavelet-transform-based HCS framework. (a) OMP, (b) StOMP, (c) Lasso, (d) FL, (e) RCS, (f) the proposed method, (g) original bands.

3.2. Performance Evaluation on Dictionary Based HCS

In the hyperspectral domain, an HSI can be linearly represented by a spectrum dictionary, where each spectrum in the dictionary is termed an endmember. Given the endmembers [2] extracted from each HSI by N-FINDR [10] as the dictionary $D$ to transform the HSI into a sparse matrix $Y$, OMP [9], StOMP [5], Lasso [6] and the methods of [7] and [8] are adopted as the comparison methods to reconstruct the HSI with the sampling rate $\rho$ ranging from 0.1 to 0.9. To simulate the random noise in the compressive sensing procedure, we add white Gaussian noise to the measurements at two SNR levels. As in the main paper, INDIANA, URBAN and PAVIAU are employed as the test data.
Figure 3. The PSNR curves and SAM bar charts of different methods on the three datasets under the first SNR level. (a)(b)(c) PSNR curves; (d)(e)(f) SAM bar charts.

Figure 4. The PSNR curves and SAM bar charts of different methods on the three datasets under the second SNR level. (a)(b)(c) PSNR curves; (d)(e)(f) SAM bar charts.

Table 1. The SSIM scores of different methods (OMP [9], StOMP [5], Lasso [6], [7], [8] and the proposed method) on the INDIANA, URBAN and PAVIAU datasets under the first SNR level, for sampling rates $\rho = 0.1, \ldots, 0.9$.

Three evaluation measures, namely PSNR, SAM and SSIM, are utilized to evaluate the reconstruction performance; they record the average performance of each method over the Monte-Carlo runs. Under the two levels of noise, the PSNR and SAM results on the three datasets are shown in Figure 3 and Figure 4, respectively. It is clear that the proposed method gives the highest PSNR among all comparison methods, and its SAM values are lower than those of the other methods in most cases. Additionally, the SSIM scores on the three datasets are given in Table 1 and Table 2. Under both noise levels, the proposed method gives the highest SSIM scores in most cases. The results under all three measures indicate that the proposed method outperforms the other methods in the reconstruction accuracy of the HSI with the dictionary-based sparsification strategy.

References

[1] S. D. Babacan, R. Molina, and A. K. Katsaggelos. Bayesian compressive sensing using Laplace priors. IEEE Transactions on Image Processing, 19(1):53-63, 2010.
6 Table. The SSIM scores of different methods on three datasets under SNR db. Results on INDIANA dataset ρ.1 ρ. ρ.3 ρ.4 ρ. ρ.6 ρ.7 ρ.8 ρ.9 [9] St [] [6] [7] [8] Results on URBAN dataset ρ.1 ρ. ρ.3 ρ.4 ρ. ρ.6 ρ.7 ρ.8 ρ.9 [9] St [] [6] [7] [8] Results on PAVIAU dataset ρ.1 ρ. ρ.3 ρ.4 ρ. ρ.6 ρ.7 ρ.8 ρ.9 [9] St [] [6] [7] [8] a b c d e f g Figure. Reconstruction results of the th band from INDIANA, the 6th band from URBAN and the 9th band from PAVIAU with sampling rate ρ. when SNR of measurements is db. All the comparison methods recover the HSI in a dictionary based HCS framework. a, b St, c Lasso, d FL, e RCS, f, g Original bands. [] J. Bioucas-Dias and A. Plaza. Hyperspectral unmixing: Geometrical, statistical, and sparse regression-based approaches. In Proc. SPIE, volume 78, page 78A,. [3] S. Boyd and L. Vandenberghe. Convex optimization. Cambridge university press, 4. [4] E. J. Candes, M. B. Wakin, and S. P. Boyd. Enhancing sparsity by reweighted l1 imization. Journal of Fourier analysis and applications, 14-6:877 9, 8. [] D. L. Donoho, Y. Tsaig, I. Drori, and J.-L. Starck. Sparse solution of underdetered systems of linear equations by stagewise orthogonal matching pursuit. Information Theory, IEEE Transactions on, 8:94 111, 1. [6] B. Efron, T. Hastie, I. Johnstone, R. Tibshirani, et al. Least angle regression. The Annals of statistics, 3:47 499, 4. [7] C. Li, T. Sun, K. F. Kelly, and Y. Zhang. A compressive sensing and unmixing scheme for hyperspectral data processing. Image Processing, IEEE Transactions on, 13: 1, 1.
Figure 6. Reconstruction results of a selected band from INDIANA, the 6th band from URBAN and the 9th band from PAVIAU at a fixed sampling rate $\rho$ under the second SNR level of the measurements. All the comparison methods recover the HSI in the dictionary-based HCS framework. (a) OMP, (b) StOMP, (c) Lasso, (d) FL, (e) RCS, (f) the proposed method, (g) original bands.

[8] G. Martin, J. M. Bioucas-Dias, and A. Plaza. Hyperspectral coded aperture (HYCA): a new technique for hyperspectral compressive sensing. In Proceedings of the 21st European Signal Processing Conference (EUSIPCO), pages 1-5. IEEE, 2013.
[9] J. A. Tropp and A. C. Gilbert. Signal recovery from random measurements via orthogonal matching pursuit. IEEE Transactions on Information Theory, 53(12):4655-4666, 2007.
[10] M. E. Winter. N-FINDR: an algorithm for fast autonomous spectral end-member determination in hyperspectral data. In SPIE's International Symposium on Optical Science, Engineering, and Instrumentation. International Society for Optics and Photonics, 1999.
More informationOn High-Dimensional Cross-Validation
On High-Dimensional Cross-Validation BY WEI-CHENG HSIAO Institute of Statistical Science, Academia Sinica, 128 Academia Road, Section 2, Nankang, Taipei 11529, Taiwan hsiaowc@stat.sinica.edu.tw 5 WEI-YING
More informationNecessary and sufficient conditions of solution uniqueness in l 1 minimization
1 Necessary and sufficient conditions of solution uniqueness in l 1 minimization Hui Zhang, Wotao Yin, and Lizhi Cheng arxiv:1209.0652v2 [cs.it] 18 Sep 2012 Abstract This paper shows that the solutions
More informationMODIFIED DISTRIBUTED ITERATIVE HARD THRESHOLDING
MODIFIED DISTRIBUTED ITERATIVE HARD THRESHOLDING Puxiao Han, Ruixin Niu Virginia Commonwealth University Dept. of Electrical and Computer Engineering Richmond, VA, 23284, U.S.A. Email: {hanp, rniu}@vcu.edu
More informationCo-Prime Arrays and Difference Set Analysis
7 5th European Signal Processing Conference (EUSIPCO Co-Prime Arrays and Difference Set Analysis Usham V. Dias and Seshan Srirangarajan Department of Electrical Engineering Bharti School of Telecommunication
More informationBias-free Sparse Regression with Guaranteed Consistency
Bias-free Sparse Regression with Guaranteed Consistency Wotao Yin (UCLA Math) joint with: Stanley Osher, Ming Yan (UCLA) Feng Ruan, Jiechao Xiong, Yuan Yao (Peking U) UC Riverside, STATS Department March
More informationLecture 3: Compressive Classification
Lecture 3: Compressive Classification Richard Baraniuk Rice University dsp.rice.edu/cs Compressive Sampling Signal Sparsity wideband signal samples large Gabor (TF) coefficients Fourier matrix Compressive
More information2 Regularized Image Reconstruction for Compressive Imaging and Beyond
EE 367 / CS 448I Computational Imaging and Display Notes: Compressive Imaging and Regularized Image Reconstruction (lecture ) Gordon Wetzstein gordon.wetzstein@stanford.edu This document serves as a supplement
More informationRecovery of Sparse Signals Using Multiple Orthogonal Least Squares
Recovery of Sparse Signals Using Multiple Orthogonal east Squares Jian Wang, Ping i Department of Statistics and Biostatistics arxiv:40.505v [stat.me] 9 Oct 04 Department of Computer Science Rutgers University
More informationAn Algorithm for Bayesian Variable Selection in High-dimensional Generalized Linear Models
Proceedings 59th ISI World Statistics Congress, 25-30 August 2013, Hong Kong (Session CPS023) p.3938 An Algorithm for Bayesian Variable Selection in High-dimensional Generalized Linear Models Vitara Pungpapong
More informationCourse Notes for EE227C (Spring 2018): Convex Optimization and Approximation
Course Notes for EE227C (Spring 2018): Convex Optimization and Approximation Instructor: Moritz Hardt Email: hardt+ee227c@berkeley.edu Graduate Instructor: Max Simchowitz Email: msimchow+ee227c@berkeley.edu
More informationSupplementary Material of A Novel Sparsity Measure for Tensor Recovery
Supplementary Material of A Novel Sparsity Measure for Tensor Recovery Qian Zhao 1,2 Deyu Meng 1,2 Xu Kong 3 Qi Xie 1,2 Wenfei Cao 1,2 Yao Wang 1,2 Zongben Xu 1,2 1 School of Mathematics and Statistics,
More informationParticle Filtered Modified-CS (PaFiMoCS) for tracking signal sequences
Particle Filtered Modified-CS (PaFiMoCS) for tracking signal sequences Samarjit Das and Namrata Vaswani Department of Electrical and Computer Engineering Iowa State University http://www.ece.iastate.edu/
More informationMultiplicative and Additive Perturbation Effects on the Recovery of Sparse Signals on the Sphere using Compressed Sensing
Multiplicative and Additive Perturbation Effects on the Recovery of Sparse Signals on the Sphere using Compressed Sensing ibeltal F. Alem, Daniel H. Chae, and Rodney A. Kennedy The Australian National
More informationWavelet Footprints: Theory, Algorithms, and Applications
1306 IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 51, NO. 5, MAY 2003 Wavelet Footprints: Theory, Algorithms, and Applications Pier Luigi Dragotti, Member, IEEE, and Martin Vetterli, Fellow, IEEE Abstract
More informationMachine Learning for Big Data CSE547/STAT548, University of Washington Emily Fox February 4 th, Emily Fox 2014
Case Study 3: fmri Prediction Fused LASSO LARS Parallel LASSO Solvers Machine Learning for Big Data CSE547/STAT548, University of Washington Emily Fox February 4 th, 2014 Emily Fox 2014 1 LASSO Regression
More informationLecture Notes 9: Constrained Optimization
Optimization-based data analysis Fall 017 Lecture Notes 9: Constrained Optimization 1 Compressed sensing 1.1 Underdetermined linear inverse problems Linear inverse problems model measurements of the form
More informationA ROBUST TEST FOR NONLINEAR MIXTURE DETECTION IN HYPERSPECTRAL IMAGES
A ROBUST TEST FOR NONLINEAR MIXTURE DETECTION IN HYPERSPECTRAL IMAGES Y. Altmann, N. Dobigeon and J-Y. Tourneret University of Toulouse IRIT-ENSEEIHT Toulouse, France J.C.M. Bermudez Federal University
More informationInverse problems and sparse models (6/6) Rémi Gribonval INRIA Rennes - Bretagne Atlantique, France.
Inverse problems and sparse models (6/6) Rémi Gribonval INRIA Rennes - Bretagne Atlantique, France remi.gribonval@inria.fr Overview of the course Introduction sparsity & data compression inverse problems
More informationDetecting Sparse Structures in Data in Sub-Linear Time: A group testing approach
Detecting Sparse Structures in Data in Sub-Linear Time: A group testing approach Boaz Nadler The Weizmann Institute of Science Israel Joint works with Inbal Horev, Ronen Basri, Meirav Galun and Ery Arias-Castro
More informationLeast Squares Regression
CIS 50: Machine Learning Spring 08: Lecture 4 Least Squares Regression Lecturer: Shivani Agarwal Disclaimer: These notes are designed to be a supplement to the lecture. They may or may not cover all the
More information