Performance Analysis for Sparse Support Recovery


1 Performance Analysis for Sparse Support Recovery
Gongguo Tang and Arye Nehorai
ESE, Washington University
April 21st, 2009

2 Outline
1. Background and Motivation
2. Research Overview
3. Mathematical Model
4. Theoretical Analysis
5. Conclusions
6. Future Work

3 Background and Motivation

4 Background: Basic Concepts and Notations
Sparse signals are signals that have only a few nonzero components under a common basis/dictionary. The set of indices corresponding to the nonzero components is called the support of the signal. If several sparse signals share a common support, we call them jointly sparse. Sparse signal support recovery aims at identifying the true support of jointly sparse signals from their noisy linear measurements.
Suppose $S$ is an index set. For a vector $x \in \mathbb{F}^N$, $x_S$ denotes the vector formed by the components of $x$ indexed by $S$; for a matrix $A \in \mathbb{F}^{M \times N}$, $A_S$ denotes the matrix formed by the columns of $A$ indexed by $S$.
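
To make this indexing notation concrete, here is a minimal numpy sketch (not from the original slides; all names and sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 10, 4
x = rng.standard_normal(N)       # a vector in F^N (here F = R)
A = rng.standard_normal((M, N))  # a matrix in F^{M x N}

S = np.array([1, 4, 7])          # an index set
x_S = x[S]                       # x_S: components of x indexed by S
A_S = A[:, S]                    # A_S: columns of A indexed by S
print(x_S.shape, A_S.shape)      # (3,) (4, 3)
```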

5 Background: Review of Compressive Sensing
The long-established paradigm for digital data acquisition:
- sample data at the Nyquist rate (2x bandwidth)
- compress data (signal-dependent, nonlinear)
- a brick wall to resolution/performance
This slide is adapted from R. Baraniuk, J. Romberg and M. Wakin's "Tutorial on Compressive Sensing".

6 "Why go to so much e ort to acquire all the data when most of what we get will be thrown away? Can t we just directly measure the part that won t end up being thrown away?" David L. Donoho Gongguo Tang and Arye Nehorai (Institute) Performance Analysis for Sparse Support Recovery April 21st / 41

7 Background: Review of Compressive Sensing
- Directly acquire compressed data
- Replace samples by more general measurements, with $K < M \ll N$
This slide is adapted from R. Baraniuk, J. Romberg and M. Wakin's "Tutorial on Compressive Sensing".

8 Background: Review of Compressive Sensing
- When data is sparse/compressible, we can directly acquire a condensed representation with no/little information loss
- Random projection will work
This slide is adapted from R. Baraniuk, J. Romberg and M. Wakin's "Tutorial on Compressive Sensing".

9 Background: Previous Assumptions
When there is measurement noise, there are different criteria for measuring the recovery performance:
- various $\ell_p$ norms $E\|\hat{x} - x\|_p$, especially $\ell_2$ and $\ell_1$
- predictive power (e.g., $E\|y - \hat{y}\|_2^2$, where $\hat{y}$ is the estimate of $y$ based on $\hat{x}$)
- 0-1 loss associated with the event of recovering the correct support $S$
Assumptions on noise:
- bounded noise
- sparse noise
- Gaussian noise

10 Background: Previous Assumptions
Assumptions on the sparse signal:
- deterministic with unknown support but known component values
- deterministic with unknown support and unknown component values
- random with unknown support
Assumptions on the measurement matrix:
- standard Gaussian ensemble
- Bernoulli ensemble
- random but structured, such as Toeplitz
- deterministic

11 Motivation: Why Support Recovery?
The support of a sparse signal has physical significance:
- the timing of events
- the locations of objects or anomalies (Compressive Radar Imaging, Compressive Sensor Networks)
- the frequency components (Compressive Spectrum Analysis)
- the existence of certain substances such as chemicals and mRNAs (Compressed Sensing DNA Microarrays)

12 Motivation: Theoretical Consideration
After recovery of the support, the magnitudes of the nonzero components can be obtained by solving a least-squares problem.
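
A minimal sketch of this least-squares step (illustrative sizes; the support is assumed already recovered):

```python
import numpy as np

rng = np.random.default_rng(1)
M, N, K = 30, 100, 5
A = rng.standard_normal((M, N))
S = rng.choice(N, size=K, replace=False)      # assumed recovered support

x = np.zeros(N)
x[S] = rng.standard_normal(K)                 # ground-truth sparse signal
y = A @ x + 0.05 * rng.standard_normal(M)     # noisy measurement

# Restrict to the columns A_S and solve the least-squares problem.
x_hat = np.zeros(N)
x_hat[S], *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
print(np.linalg.norm(x_hat - x))              # small reconstruction error
```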

13 Motivation: Other Applications Leading to Support Recovery
Consider the parameter estimation problem associated with the following widely applied model:
$$y(t) = A(\theta)x(t) + w(t), \quad t = 1, \dots, T,$$
where $A(\theta) = [\phi(\theta_1), \phi(\theta_2), \dots, \phi(\theta_K)]$ and $\theta_1, \theta_2, \dots, \theta_K$ are the true parameters. To solve this problem, we sample the parameter space at $\tilde{\theta}_1, \tilde{\theta}_2, \dots, \tilde{\theta}_N$ and form $\tilde{A} = [\phi(\tilde{\theta}_1), \phi(\tilde{\theta}_2), \dots, \phi(\tilde{\theta}_N)]$. Define the vector $\tilde{x}(t)$ by setting its components to those of $x(t)$ at the locations corresponding to the true parameters, and to zero otherwise. We have thus transformed a traditional parameter estimation problem into one of support recovery (see the sketch below).
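
As a hedged illustration of this construction, the sketch below builds $\tilde{A}$ from a sampled frequency grid, taking $\phi(\theta)$ to be a unit-norm complex sinusoid (an assumed choice; the slide does not specify $\phi$):

```python
import numpy as np

M, N = 32, 256
theta_grid = np.arange(N) / N                 # sampled parameter space
m = np.arange(M)[:, None]
# Columns phi(theta_n) = exp(j*2*pi*theta_n*m)/sqrt(M); A_tilde is M x N.
A_tilde = np.exp(2j * np.pi * m * theta_grid[None, :]) / np.sqrt(M)

# If the true parameters lie on the grid, x_tilde is sparse and
# recovering supp(x_tilde) recovers the parameters themselves.
true_idx = np.array([17, 90, 201])
x_tilde = np.zeros(N, dtype=complex)
x_tilde[true_idx] = np.array([1.0, 0.7, 1.3])
y = A_tilde @ x_tilde                         # one noiseless snapshot
```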

14 Research Overview

15 Research Overview
- Introduce hypothesis testing problems for sparse signal support recovery
- Derive an upper bound on the probability of error (PoE) for a general measurement matrix
- Study the effect of different parameters
- Analyze the PoE for multiple hypothesis testing and its implications for system design

16 Mathematical Model

17 Mathematical Model: Measurement Model
We focus on the following model:
$$y(t) = Ax(t) + w(t), \quad t = 1, \dots, T, \qquad (1)$$
or, in matrix form,
$$Y = AX + W.$$
Here $x(t) \in \mathbb{F}^N$, $w(t) \in \mathbb{F}^M$, $y(t) \in \mathbb{F}^M$, with $\mathbb{F} = \mathbb{R}$ or $\mathbb{C}$. $X$, $W$, $Y$ are the matrices whose columns are $\{x(t)\}_{t=1}^T$, $\{w(t)\}_{t=1}^T$, $\{y(t)\}_{t=1}^T$, respectively. Our analysis involves a constant $\kappa$, which is $1/2$ for $\mathbb{F} = \mathbb{R}$ and $1$ for $\mathbb{F} = \mathbb{C}$.
Generally $M$ is the dimension of the hardware while $T$ is the number of time samples; hence increasing $M$ is more expensive.

18 Mathematical Model: Assumptions on Signal and Noise
We make the following assumptions:
- $\{x(t)\}_{t=1}^T$ are jointly sparse signals with a common support $S = \mathrm{supp}(X)$.
- $\{x_S(t)\}_{t=1}^T$ are i.i.d. $\mathcal{FN}(0, I_K)$.
- $\{w(t)\}_{t=1}^T$ are i.i.d. $\mathcal{FN}(0, \sigma^2 I_M)$ and independent of $\{x(t)\}_{t=1}^T$.
Note that the noise variance $\sigma^2$ can be viewed as $1/\mathrm{SNR}$.
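
The model and assumptions can be simulated in a few lines; a minimal sketch for $\mathbb{F} = \mathbb{R}$ (so $\kappa = 1/2$), with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(2)
M, N, K, T, sigma2 = 50, 200, 10, 30, 0.5

A = rng.standard_normal((M, N))                    # Gaussian measurement matrix
S = np.sort(rng.choice(N, size=K, replace=False))  # common support, |S| = K

X = np.zeros((N, T))
X[S, :] = rng.standard_normal((K, T))              # x_S(t) i.i.d. N(0, I_K)
W = np.sqrt(sigma2) * rng.standard_normal((M, T))  # i.i.d. N(0, sigma^2 I_M)
Y = A @ X + W                                      # Y = AX + W
```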

19 Mathematical Model: Assumptions on the Measurement Matrix
We consider two types of measurement matrices:
1. Non-degenerate measurement matrix: a general measurement matrix $A \in \mathbb{F}^{M \times N}$ is non-degenerate if every $M \times M$ submatrix of $A$ is nonsingular.
2. Gaussian measurement matrix: the elements $a_{ij}$ of $A$ are i.i.d. $\mathcal{FN}(0, 1)$.

20 Mathematical Model: Hypothesis Testing
We focus on two hypothesis testing problems:
1. Binary hypothesis testing (BHT) with $|S_0| = |S_1|$:
$$H_0: \mathrm{supp}(X) = S_0 \quad \text{vs.} \quad H_1: \mathrm{supp}(X) = S_1.$$
2. Multiple hypothesis testing (MHT):
$$H_i: \mathrm{supp}(X) = S_i, \quad i = 1, \dots, L,$$
where the $S_i$'s are candidate supports with the same cardinality $|S_i| = K$.

21 Mathematical Model: Probability of Error
Our aim is to calculate an accurate upper bound on the PoE and analyze the effect of $M$, $T$, and the noise variance $\sigma^2$:
$$p_{\mathrm{err}}(A) = \frac{1}{2}\left[\int_{\mathcal{H}_1} \Pr(Y \mid H_0)\,dY + \int_{\mathcal{H}_0} \Pr(Y \mid H_1)\,dY\right]$$
for BHT, and
$$p_{\mathrm{err}}(A) = \frac{1}{L}\sum_{i=1}^{L} \int_{\bigcup_{j \neq i} \mathcal{H}_j} \Pr(Y \mid H_i)\,dY$$
for MHT, where $\mathcal{H}_i$ denotes the decision region for $H_i$.

22 Theoretical Analysis

23 Theoretical Analysis: Optimal Decision Rule for BHT
With $Y = AX + W$, the BHT problem is equivalent to deciding between two distributions of $Y$:
$$Y \mid H_0 \sim \mathcal{FN}_{M,T}(0, \Sigma_0 \otimes I_T) \quad \text{or} \quad Y \mid H_1 \sim \mathcal{FN}_{M,T}(0, \Sigma_1 \otimes I_T),$$
where $\Sigma_i = \sigma^2 I_M + A_{S_i} A_{S_i}^*$. With equal prior probabilities on $S_0$ and $S_1$, the optimal decision rule is the likelihood ratio test
$$\frac{f(Y \mid H_1)}{f(Y \mid H_0)} \underset{H_0}{\overset{H_1}{\gtrless}} 1,$$
i.e., decide $H_1$ when $\mathrm{tr}\left[Y^*\left(\Sigma_1^{-1} - \Sigma_0^{-1}\right)Y\right] < T \log\frac{|\Sigma_0|}{|\Sigma_1|}$, and $H_0$ otherwise.
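
A direct numpy sketch of this likelihood ratio test for the real case (variable names follow the slide; the decision direction matches the false-alarm expression on the next slide):

```python
import numpy as np

def lrt_decide(Y, A, S0, S1, sigma2):
    """Return 1 if H1 (supp(X) = S1) is declared, else 0."""
    M, T = Y.shape
    Sigma0 = sigma2 * np.eye(M) + A[:, S0] @ A[:, S0].T
    Sigma1 = sigma2 * np.eye(M) + A[:, S1] @ A[:, S1].T
    stat = np.trace(Y.T @ (np.linalg.inv(Sigma1) - np.linalg.inv(Sigma0)) @ Y)
    _, logdet0 = np.linalg.slogdet(Sigma0)
    _, logdet1 = np.linalg.slogdet(Sigma1)
    # Declare H1 when the statistic falls below T * log(|Sigma0|/|Sigma1|).
    return int(stat < T * (logdet0 - logdet1))
```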

24 Theoretical Analysis: Calculation of PoE for BHT
By the symmetry between $H_0$ and $H_1$, it suffices to compute the probability of false alarm:
$$p_{\mathrm{FA}} = \Pr\{H_1 \mid H_0\} = \Pr\left\{\mathrm{tr}\left[Y^*\left(\Sigma_1^{-1} - \Sigma_0^{-1}\right)Y\right] < T \log\frac{|\Sigma_0|}{|\Sigma_1|} \,\Big|\, H_0\right\}$$
$$= \Pr\left\{\mathrm{tr}\left[Z^*\left(\Sigma_0^{1/2}\Sigma_1^{-1}\Sigma_0^{1/2} - I_M\right)Z\right] < T \log\frac{|\Sigma_0|}{|\Sigma_1|} \,\Big|\, H_0\right\},$$
where $Z = \Sigma_0^{-1/2} Y \sim \mathcal{FN}(0, I_M \otimes I_T)$. We define $H = \Sigma_0^{1/2}\Sigma_1^{-1}\Sigma_0^{1/2}$, with $\Sigma_i = A_{S_i}A_{S_i}^* + \sigma^2 I_M$, which is the fundamental matrix in our analysis.

25 Theoretical Analysis: Calculation of PoE for BHT
Suppose the ordered eigenvalues of $H$ are
$$\sigma_1 < \sigma_2 < \cdots < \sigma_{k_1} < 1 = 1 = \cdots = 1 < \lambda_1 < \lambda_2 < \cdots < \lambda_{k_0},$$
and that $H$ is diagonalized by an orthogonal/unitary matrix $Q$. Then the transformation $Z = QN$ gives
$$p_{\mathrm{FA}} = \Pr\left\{\sum_{i=1}^{k_0}(\lambda_i - 1)\sum_{t=1}^{T}|N_{it}|^2 - \sum_{i=1}^{k_1}(1 - \sigma_i)\sum_{t=1}^{T}\left|N_{(i+k_0)t}\right|^2 < T\log\frac{|\Sigma_0|}{|\Sigma_1|} \,\Big|\, H_0\right\}.$$

26 Theoretical Analysis: Eigenvalue Structure of H
The eigenvalue structure of $H$, especially the eigenvalues that are greater than 1, determines the performance of the measurement matrix $A$ in distinguishing between different supports. We study the structure of $H$ in a slightly more general setting where the sizes of the two candidate supports may differ.
Problems:
1. How many eigenvalues of $H$ are less than 1, greater than 1, and equal to 1? Is there a general rule?
2. Can we give tight lower bounds on the eigenvalues that are greater than 1? The bounds should have a nice distribution that can be handled easily.

27 Theoretical Analysis: Eigenvalue Structure of H
[Figure: eigenvalues of $H$ for $M = 200$, $|S_0 \cap S_1| = 20$, $|S_0 \setminus S_1| = 80$, $|S_1 \setminus S_0| = 60$, with the elements of $A$ i.i.d. real Gaussian.]

28 Theoretical Analysis: Eigenvalue Structure of H
Note that $|S_1 \setminus S_0| = 60$ eigenvalues of $H$ are less than 1, $|S_0 \setminus S_1| = 80$ are greater than 1, and $M - (|S_0 \setminus S_1| + |S_1 \setminus S_0|) = 60$ are equal to 1.

29 Theoretical Analysis: Eigenvalue Structure of H
Theorem. Suppose $k_i = |S_0 \cap S_1|$, $k_0 = |S_0 \setminus S_1|$, $k_1 = |S_1 \setminus S_0|$, and $M > k_0 + k_1$. For a general non-degenerate measurement matrix, $k_0$ eigenvalues of $H$ are greater than 1, $k_1$ are less than 1, and $M - (k_0 + k_1)$ are equal to 1.
Note that, from the bound we present later, the quantity $\sqrt{\prod_{i=1}^{k_0} \lambda_i \prod_{i=1}^{k_1} (1/\sigma_i)}$ determines the performance of the optimal BHT decision rule. Hence, generally and quite intuitively, the larger the difference set between $S_0$ and $S_1$, the easier it is to distinguish between the two candidate supports. (A numerical check of the theorem appears below.)
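
The theorem's eigenvalue counts are easy to verify numerically; a sketch with the sizes from the earlier figure ($M = 200$, $k_i = 20$, $k_0 = 80$, $k_1 = 60$) and an assumed $\sigma^2$:

```python
import numpy as np

rng = np.random.default_rng(3)
M, k_i, k0, k1, sigma2 = 200, 20, 80, 60, 4.0

# Supports with |S0 ∩ S1| = k_i, |S0 \ S1| = k0, |S1 \ S0| = k1.
S0 = np.arange(k_i + k0)
S1 = np.concatenate([np.arange(k_i), np.arange(k_i + k0, k_i + k0 + k1)])

A = rng.standard_normal((M, k_i + k0 + k1))     # only S0 ∪ S1 columns needed
Sigma0 = A[:, S0] @ A[:, S0].T + sigma2 * np.eye(M)
Sigma1 = A[:, S1] @ A[:, S1].T + sigma2 * np.eye(M)

# H = Sigma0^{1/2} Sigma1^{-1} Sigma0^{1/2} via the symmetric square root.
w, U = np.linalg.eigh(Sigma0)
half = U @ np.diag(np.sqrt(w)) @ U.T
eig = np.linalg.eigvalsh(half @ np.linalg.solve(Sigma1, half))

tol = 1e-6
print((eig > 1 + tol).sum(),           # expect k0 = 80 eigenvalues > 1
      (eig < 1 - tol).sum(),           # expect k1 = 60 eigenvalues < 1
      (np.abs(eig - 1) < tol).sum())   # expect M - (k0 + k1) = 60 equal to 1
```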

30 Theoretical Analysis: Eigenvalue Structure of H
Theorem. For a Gaussian measurement matrix, the sorted eigenvalues of $H$ that are greater than 1 are lower bounded, with probability one, by those of $I_{k_0} + \frac{1}{\sigma^2}V$, where $V$ is a matrix obtained from the measurement matrix $A$ and $V \sim \mathcal{W}_{k_0}\left(I_{k_0},\, 2\kappa(M - k_1 - k_i)\right)$.
We remark that, generally, the larger $M - k_1 - k_i = M - |S_1|$, the larger the eigenvalues of $I_{k_0} + \frac{1}{\sigma^2}V$, and hence the better we can distinguish the true support from the false one.

31 Theoretical Analysis: A Lower Bound on the Eigenvalues
[Figure: $M = 200$, $|S_0 \cap S_1| = 20$, $|S_0 \setminus S_1| = 80$, $|S_1 \setminus S_0| = 60$, $\sigma^2 = 4$, with the elements of $A$ i.i.d. real Gaussian. The blue line shows the true sorted eigenvalues of $H$ that are greater than 1; the red line shows the lower bound.]

32 Theoretical Analysis: Bound on PoE
Theorem. The probability of false alarm can be bounded by
$$p_{\mathrm{FA}} = \Pr(S_1 \mid H_0) \le \left[\left(\frac{\lambda_g(S_0, S_1)}{4}\right)^{k_d/2}\left(\frac{\lambda_g(S_1, S_0)}{4}\right)^{k_d/2}\right]^{-\kappa T},$$
where $k_d = |S_0 \setminus S_1|$ and $\lambda_g(S_0, S_1) = \left(\prod_{j=1}^{k_d} \lambda_j\right)^{1/k_d}$, with the $\lambda_j$'s the eigenvalues of
$$H = \left(A_{S_0}A_{S_0}^* + \sigma^2 I_M\right)^{1/2}\left(A_{S_1}A_{S_1}^* + \sigma^2 I_M\right)^{-1}\left(A_{S_0}A_{S_0}^* + \sigma^2 I_M\right)^{1/2}$$
that are greater than one; $\lambda_g(S_1, S_0)$ is defined with the roles of $S_0$ and $S_1$ exchanged.
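
A sketch evaluating this bound from the eigenvalues of $H$. It uses the fact that $H$ with the roles of $S_0$ and $S_1$ swapped has eigenvalues that are the reciprocals of those of $H$, so $\lambda_g(S_1, S_0)$ is the geometric mean of the $1/\sigma_i$'s; `eig` can be taken from the eigenvalue-check sketch above, and $\kappa = 1/2$ for real data:

```python
import numpy as np

def pfa_bound(eig, k_d, T, kappa=0.5, tol=1e-6):
    """Evaluate the p_FA bound of the theorem from the eigenvalues of H."""
    lam = eig[eig > 1 + tol]                    # eigenvalues of H above 1
    sig = eig[eig < 1 - tol]                    # eigenvalues of H below 1
    lam_g_01 = np.exp(np.mean(np.log(lam)))     # lambda_g(S0, S1)
    lam_g_10 = np.exp(-np.mean(np.log(sig)))    # lambda_g(S1, S0), via 1/sigma_i
    return ((lam_g_01 / 4) ** (k_d / 2)
            * (lam_g_10 / 4) ** (k_d / 2)) ** (-kappa * T)

# Example: with `eig` from the earlier sketch, k_d = |S0 \ S1| = 80.
# print(pfa_bound(eig, k_d=80, T=30))
```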

33 Theoretical Analysis: Implications of the Bound
The bound can be written equivalently as
$$\left(\frac{\sqrt[2k_d]{\prod_{i=1}^{k_d}\lambda_i \prod_{i=1}^{k_d}(1/\sigma_i)}}{4}\right)^{-\kappa k_d T},$$
with the $\lambda_i$'s and $\sigma_i$'s the eigenvalues of $H$ that are greater and less than 1, respectively. Hence these eigenvalues determine the system's ability to distinguish two supports. As we will see, the minimum of all the $\lambda_g(S_i, S_j)$'s determines the system's ability to distinguish all candidate supports, and can be viewed as a measure of incoherence.
The logarithm of the bound can be approximated by
$$-\kappa k_d T\left[\frac{1}{2}\log\left(\lambda_g(S_0, S_1)\,\lambda_g(S_1, S_0)\right) - \log 4\right].$$
Hence, if we can guarantee that $\lambda_g(S_0, S_1)\,\lambda_g(S_1, S_0)$ of our measurement matrix is greater than some constant, then we can make $p_{\mathrm{FA}}$ arbitrarily small by taking more temporal samples.

34 Theoretical Analysis: Multiple Hypothesis Testing
We now turn to the MHT problem
$$H_i: \mathrm{supp}(X) = S_i, \quad i = 1, \dots, L,$$
where the $S_i$'s are candidate supports with the same cardinality $|S_i| = K$ and $L = \binom{N}{K}$, the total number of candidate supports of size $K$.

35 Theoretical Analysis: PoE for MHT
Theorem. Denote $\lambda_{\min} = \min_{i \neq j} \lambda_g(S_i, S_j)$. Then the total PoE for MHT can be bounded by
$$p_{\mathrm{err}} \le C \exp\left\{-\kappa T\left[\log \lambda_{\min} - \log\left(4\left[K(N - K)\right]^{(\kappa T)^{-1}}\right)\right]\right\}.$$
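
For small problems, $\lambda_{\min}$ can be computed by brute force over all ordered pairs of candidate supports; a sketch with toy sizes (the enumeration has $L = \binom{N}{K}$ supports, so this scales badly; all constants are illustrative):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(4)
M, N, K, sigma2 = 20, 8, 2, 1.0
A = rng.standard_normal((M, N))

def lambda_g_pair(A, S0, S1, sigma2, tol=1e-6):
    """Geometric mean of the eigenvalues of H(S0, S1) that exceed 1."""
    M = A.shape[0]
    Sig0 = A[:, S0] @ A[:, S0].T + sigma2 * np.eye(M)
    Sig1 = A[:, S1] @ A[:, S1].T + sigma2 * np.eye(M)
    w, U = np.linalg.eigh(Sig0)
    half = U @ np.diag(np.sqrt(w)) @ U.T
    eig = np.linalg.eigvalsh(half @ np.linalg.solve(Sig1, half))
    lam = eig[eig > 1 + tol]
    return np.exp(np.mean(np.log(lam)))

supports = [list(c) for c in combinations(range(N), K)]  # L = C(N, K)
lam_min = min(lambda_g_pair(A, s0, s1, sigma2)
              for s0 in supports for s1 in supports if s0 != s1)
print(lam_min)
```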

36 Theoretical Analysis: Multiple Hypothesis Testing
Theorem. For $T = O\!\left(\frac{\log N}{\log\left[K \log \frac{N}{K}\right]}\right)$ and $M = O\left(K \log(N/K)\right)$ as $N, K, M \to \infty$,
$$\Pr\left\{\lambda_{\min} > 4\left[K(N - K)\right]^{(\kappa T)^{-1}}\right\} \to 1.$$

37 Theoretical Analysis: Discussion
- $M = O(K \log(N/K))$ is the same as in conventional compressive sensing.
- We need $MT$ samples in total. When $K$ is sufficiently small compared with $N$, this value is still much smaller than $N$.
- The required value of $T$ is actually not very large. For example, for $K = 10^5$ we have $\log\left[K \log \frac{N}{K}\right] \approx 13$; for $N = 10^{100}$, $K = 10^{98}$, we have $\frac{\log N}{\log\left[K \log \frac{N}{K}\right]} \approx 1$.
- After we recover the support, we can obtain the component values by solving a least-squares problem.

38 Theoretical Analysis: Implications of the Bound
In practice, given $N$ and $K$, we take $M = O(K \log(N/K))$ and $T = O\!\left(\frac{\log N}{\log\left[K \log \frac{N}{K}\right]}\right)$ and generate the measurement matrix $A$. Then, with large probability, we get $\lambda_{\min} > 4\left[K(N - K)\right]^{\frac{1}{\kappa T}}$.
To be safe, we can:
- compute $\lambda_{\min}$ and find $T$ large enough that $\lambda_{\min} > 4\left[K(N - K)\right]^{\frac{1}{\kappa T}}$;
- continue to increase $T$ so that $p_{\mathrm{err}} < \alpha$.
A procedural sketch of this recipe follows below.
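
A sketch of this recipe as code, assuming the PoE bound as reconstructed on slide 35 and a precomputed $\lambda_{\min}$ (the constant `C`, the target `alpha`, and the cap `T_max` are illustrative):

```python
import numpy as np

def choose_T(lam_min, N, K, alpha, kappa=0.5, C=1.0, T_max=10**6):
    """Smallest T meeting both conditions of the design recipe, or None."""
    for T in range(1, T_max + 1):
        threshold = 4 * (K * (N - K)) ** (1.0 / (kappa * T))
        if lam_min <= threshold:        # first condition not yet met
            continue
        # PoE bound from the MHT theorem (as reconstructed above).
        p_err = C * np.exp(-kappa * T * (np.log(lam_min) - np.log(threshold)))
        if p_err < alpha:               # second condition
            return T
    return None

# Example: choose_T(lam_min=6.0, N=100, K=5, alpha=1e-3)
```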

39 Conclusions
- Hypothesis testing for sparse signal support recovery: BHT and MHT
- A bound on the PoE for non-degenerate measurement matrices
- The behavior of the critical quantity
- Implications for system design: another dimension of data collection gives us more flexibility

40 Future Work
- Design measurement systems with optimal $\lambda_{\min}$.
- Establish a necessary condition on $M$ and $T$.
- Analyze the behavior of $\lambda_g(S_0, S_1)$ and $\lambda_{\min}$ for other measurement matrix structures.
- Devise an efficient algorithm for support recovery and compare its performance with the optimal one, including the performance of the $\ell_1$-minimization algorithm.
- Develop an algorithm to compute $\lambda_{\min}$ for a given measurement matrix.
- Explore the relationship between $\lambda_{\min}$ and the Restricted Isometry Property (RIP).
- Apply these results to the design of transmitted signals in Compressive Radar Imaging.

41 Thank you!
