Sparsity and Compressed Sensing
1 Sparsity and Compressed Sensing. Jalal Fadili, Normandie Université-ENSICAEN, GREYC. Mathematical coffees, 2017.
2 Recap: linear inverse problems. Sensing model: $y = A x$, with observations $y \in \mathbb{R}^m$, a sensing/dictionary matrix $A \in \mathbb{R}^{m \times n}$ (denoted $H$ on the slide), and unknown $x \in \mathbb{R}^n$. [Figure: $y$ ($m \times 1$) $= H$ ($m \times n$) $\cdot\, x$ ($n \times 1$).] $A$ is fat, i.e., $m \ll n$ (underdetermined system), so the solution is not unique (fundamental theorem of linear algebra). Are we stuck? No, if the dimension of $x$ is intrinsically small.
3-6 Geometry of inverse problems. [Figure, built up over four slides: the unknown $x_0 \in \mathbb{R}^n$ lives on a low-dimensional model set and is mapped by $A$ to the observation $A x_0 \in \mathbb{R}^m$.] Objects on this model set have a low cost.
7-19 Strong notion of sparsity. [Figure, built up incrementally: in $\mathbb{R}^2$, the 0-sparse, 1-sparse, and 2-sparse (= dense) vectors; in $\mathbb{R}^3$, the 0-sparse, 1-sparse, 2-sparse, and 3-sparse (= dense) vectors.] Support and $\ell_0$ pseudo-norm: $\mathrm{supp}(x) = \{i = 1, \dots, n : x_i \neq 0\}$ and $\|x\|_0 = \#\,\mathrm{supp}(x)$ (not a norm: not positively homogeneous). Definition (informal): $x \in \mathbb{R}^n$ is sparse iff $\|x\|_0 \ll n$. Model of $s$-sparse vectors: a union of subspaces $\Sigma_s = \bigcup_i \{V_i : \dim(V_i) = s\}$, where each $V_i$ is the span of $s$ vectors from the canonical basis $(e_j)_{1 \le j \le n}$.
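Hedged illustration: a minimal NumPy sketch of the support and the $\ell_0$ pseudo-norm just defined (NumPy assumed available; the toy vector is made up for illustration).

```python
import numpy as np

x = np.array([0.0, 3.1, 0.0, 0.0, -0.7, 0.0])  # toy vector, n = 6

support = np.flatnonzero(x)   # supp(x) = {i : x_i != 0}
l0 = support.size             # ||x||_0 = #supp(x)

print(support)  # [1 4]
print(l0)       # 2 -> x is 2-sparse in R^6
```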
20-23 Weak notion of sparsity. In nature, signals, images, and information are not (strongly) sparse. [Figure: an image and its wavelet transform, with coefficients shown on a white-to-black scale around 0; the sorted wavelet coefficients decay rapidly with the index.] Definition (informal): $x \in \mathbb{R}^n$ is weakly sparse iff the sorted magnitudes $|x_{(i)}|$ decrease fast enough with $i$.
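Hedged illustration: a small NumPy sketch (with simulated power-law coefficients, not the slide's wavelet data) of what "decreases fast enough" buys: sort the magnitudes and watch the best $s$-term approximation error drop as $s$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
# simulate weakly sparse coefficients: magnitudes ~ i^{-1.5} (power-law decay)
x = rng.permutation(np.arange(1, n + 1) ** -1.5)

mags = np.sort(np.abs(x))[::-1]   # |x_(1)| >= |x_(2)| >= ... >= |x_(n)|

# best s-term approximation error ||x - x_s||_2 for a few values of s
for s in (10, 50, 200):
    err = np.sqrt(np.sum(mags[s:] ** 2))
    print(s, err)                 # error decays quickly with s
```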
24 From now on, sparsity is intended in the strong sense.
25 What is sparsity good for? Solve $y = A x$ where $x$ is sparse. If $\|x\|_0 \le m$ and $A_I$ is full-rank, where $I := \mathrm{supp}(x)$, we are done: there are at least as many equations as unknowns in $y = A_I x_I$. In practice, however, the support $I$ is not known; we have to infer it from the sole knowledge of $y$ and $A$.
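Hedged illustration: a minimal NumPy sketch of the remark above (random $A$ and a made-up support, for illustration only): once $I$ is known, recovering $x_I$ from $y = A_I x_I$ is ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 10, 50
A = rng.standard_normal((m, n))

I = [3, 17, 42]                    # assumed known support, |I| <= m
x0 = np.zeros(n)
x0[I] = [1.0, -2.0, 0.5]
y = A @ x0

# solve y = A_I x_I (overdetermined: m equations, |I| unknowns)
xI, *_ = np.linalg.lstsq(A[:, I], y, rcond=None)

x = np.zeros(n)
x[I] = xI
print(np.allclose(x, x0))          # True: exact recovery given the support
```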
26-33 Regularization. Solve $y = A x$ where $x$ is sparse, via
$\min_{x \in \mathbb{R}^n} \|x\|_0$ such that $y = A x$.
[Figure: $m = 2$, $n = 3$; the affine set $\{x : y = A x\}$ is a line in $\mathbb{R}^3$ whose generic points are 2-sparse and which meets a coordinate axis at a 1-sparse point, the sparsest solution.] This problem is not convex, not even continuous. In fact, it is a combinatorial NP-hard problem. Can we find a viable alternative?
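Hedged illustration: the combinatorial nature of the $\ell_0$ problem can be made concrete with a brute-force solver, sketched below under toy dimensions (the helper name and the instance are made up; for realistic $n$ the $\binom{n}{s}$ enumeration is exactly what makes the problem intractable).

```python
import numpy as np
from itertools import combinations

def l0_brute_force(A, y, tol=1e-10):
    """Search supports of increasing size s; return the sparsest x with Ax = y.

    Cost blows up as C(n, s): feasible only for toy sizes.
    """
    m, n = A.shape
    for s in range(m + 1):
        if s == 0:
            if np.linalg.norm(y) < tol:
                return np.zeros(n)
            continue
        for I in combinations(range(n), s):
            xI, *_ = np.linalg.lstsq(A[:, I], y, rcond=None)
            if np.linalg.norm(A[:, I] @ xI - y) < tol:
                x = np.zeros(n)
                x[list(I)] = xI
                return x
    return None

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 8))
x0 = np.zeros(8); x0[[1, 6]] = [2.0, -1.0]
print(l0_brute_force(A, y=A @ x0))   # recovers x0 (generically the sparsest)
```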
34-44 Relaxation. Solve $y = A x$ where $x$ is sparse. Compare, with $\|x\|_p = \left(\sum_{i=1}^n |x_i|^p\right)^{1/p}$, the programs below for $m = 1$, $n = 2$ [Figure: each $\ell_p$ ball is inflated until it touches the line $\{x : y = A x\}$]:
$\min_{x \in \mathbb{R}^n} \|x\|_0$ s.t. $y = A x$: not convex, not continuous, NP-hard; sparsest solution.
$\min_{x \in \mathbb{R}^n} \|x\|_{0.5}$ s.t. $y = A x$: not convex, but continuous; sparsest solution.
$\min_{x \in \mathbb{R}^n} \|x\|_1$ s.t. $y = A x$ (Basis Pursuit): continuous, convex; sparsest solution.
$\min_{x \in \mathbb{R}^n} \|x\|_{1.5}$ s.t. $y = A x$: continuous, convex; dense (LS) solution.
$\min_{x \in \mathbb{R}^n} \|x\|_2$ s.t. $y = A x$: continuous, convex; dense (LS) solution.
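Hedged illustration: a quick NumPy check (random toy instance) of the last point, that the minimum-$\ell_2$-norm solution $A^+ y$ is the least-squares-type solution and is dense even when a sparse solution exists.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 5, 20
A = rng.standard_normal((m, n))
x0 = np.zeros(n); x0[[2, 11]] = [1.5, -1.0]   # 2-sparse ground truth
y = A @ x0

x_l2 = np.linalg.pinv(A) @ y                  # min ||x||_2 s.t. y = Ax
print(np.count_nonzero(np.abs(x_l2) > 1e-8))  # ~20: dense
print(np.count_nonzero(x0))                   # 2: the sparse solution
```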
45-47 Tightest convex relaxation. $\min_{x \in \mathbb{R}^n} \|x\|_1$ s.t. $y = A x$ (BP). The $\ell_1$ ball satisfies $B_{\ell_1} = \mathrm{ConvHull}(B_{\ell_0} \cap B_{\ell_2})$: $\ell_1$ is the tightest convex relaxation of $\ell_0$.
48-49 Error correction problem. Recover $w$ from $v = B w + e$, where $w \in \mathbb{R}^m$ is the plaintext, $B \in \mathbb{R}^{n \times m}$ ($n > m$) has the codewords $b_1, b_2, \dots, b_m$ as columns, $v \in \mathbb{R}^n$ is the corrupted ciphertext, and the error $e$ affects only a small fraction of entries, i.e., $e$ is sparse. Build $A$ such that $\mathrm{span}(B) \subseteq \ker(A)$, i.e., $A B = 0$. Multiplying $v$ by $A$ gives $y = A v = A e$. The original problem can then be cast as
$x^\star \in \mathrm{Arg}\min_{x \in \mathbb{R}^n} \|x\|_1$ s.t. $y = A x$, followed by $w^\star = B^+ (v - x^\star)$.
Question: when is $x^\star = e$, so that $w^\star = w$? (See the following talks.)
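Hedged illustration: a sketch of the annihilator construction with SciPy (SciPy assumed available; dimensions are toy). scipy.linalg.null_space(B.T) returns an orthonormal basis of $\ker(B^\top)$, whose transpose gives an $A$ with $A B = 0$.

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(4)
n, m = 50, 20
B = rng.standard_normal((n, m))            # codeword matrix, n > m

A = null_space(B.T).T                      # (n-m) x n, rows span ker(B^T)
print(np.allclose(A @ B, 0))               # True: AB = 0

w = rng.standard_normal(m)
e = np.zeros(n); e[[5, 30]] = [3.0, -2.0]  # sparse corruptions
v = B @ w + e

y = A @ v
print(np.allclose(y, A @ e))               # True: y = Av = Ae, w eliminated
```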
50 Optimization algorithms. BP as a linear program: starting from
$\min_{x \in \mathbb{R}^n} \|x\|_1$ s.t. $y = A x$ (BP),
decompose $x$ into its positive and negative parts and lift to $\mathbb{R}^{2n}$:
$\min_{z \in \mathbb{R}^{2n}} \sum_{i=1}^{2n} z_i$ s.t. $y = [A, -A] z$, $z \ge 0$.
Use your favourite LP solver package: CPLEX, SeDuMi (IP), Mosek (IP), etc., and recover $x^\star = (z^\star_i)_{i=1}^n - (z^\star_i)_{i=n+1}^{2n}$. High accuracy, but scaling with the dimension $n$ is an issue.
Proximal splitting algorithms: DR, ADMM, Primal-Dual (MC of April 18th). They scale well with the dimension: cost/iteration = $O(mn)$ for the matrix-vector multiplications and $O(n)$ for the soft-thresholding. Iterative methods: less accurate.
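Hedged illustration: one way to instantiate the lifted LP above with scipy.optimize.linprog (SciPy assumed; the helper name basis_pursuit_lp and the random instance are made up, not from the course).

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit_lp(A, y):
    """Solve min ||x||_1 s.t. y = Ax via the lift z = (x+, x-) in R^{2n}."""
    m, n = A.shape
    c = np.ones(2 * n)                 # objective: sum_i z_i
    A_eq = np.hstack([A, -A])          # equality constraint y = [A, -A] z
    res = linprog(c, A_eq=A_eq, b_eq=y,
                  bounds=(0, None),    # z >= 0
                  method="highs")
    z = res.x
    return z[:n] - z[n:]               # x* = z+ - z-

rng = np.random.default_rng(5)
m, n = 10, 40
A = rng.standard_normal((m, n))
x0 = np.zeros(n); x0[[3, 25]] = [1.0, -2.0]
x = basis_pursuit_lp(A, y=A @ x0)
print(np.allclose(x, x0, atol=1e-6))   # True with high probability
```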
51 Recovery guarantees. Noiseless case $y = A x_0$:
$\min_{x \in \mathbb{R}^n} \|x\|_1$ s.t. $y = A x$ (BP).
When does (BP) have a unique solution which is the sparsest vector $x_0$?
Uniform guarantees: which conditions ensure recovery of all $s$-sparse signals?
Non-uniform guarantees: which conditions ensure recovery of the given $s$-sparse vector $x_0$?
Sample complexity bounds (random settings): can we construct sensing matrices such that the above conditions hold? What are the optimal scalings of the problem dimensions $(n, m, s)$? Necessary conditions? What if $x_0$ is only weakly sparse?
52-54 Sensitivity/stability guarantees. Noisy case $y = A x_0 + \varepsilon$:
$\min_{x \in \mathbb{R}^n} \frac{1}{2} \|y - A x\|_2^2 + \lambda \|x\|_1$, $\lambda > 0$ (BPDN/LASSO).
Study the stability of the (BPDN) solution(s) to the noise $\varepsilon$.
$\ell_2$ stability. Theorem (typical statement): under conditions XX, and the choice $\lambda = c \|\varepsilon\|_2$, there exists $C$ such that any solution $x^\star$ of (BPDN) obeys $\|x^\star - x_0\|_2 \le C \|\varepsilon\|_2$.
Support and sign stability (more stringent). Theorem (typical statement): under conditions XXXX, and the choice $\lambda = f(\|\varepsilon\|_2, \min_{i \in \mathrm{supp}(x_0)} |x_i|)$, the unique solution $x^\star$ of (BPDN) obeys $\mathrm{supp}(x^\star) = \mathrm{supp}(x_0)$ and $\mathrm{sign}(x^\star) = \mathrm{sign}(x_0)$.
Again, uniform vs non-uniform guarantees. Sample complexity bounds (random settings): can we construct sensing matrices such that the above conditions hold? What are the optimal scalings of the problem dimensions $(n, m, s)$? Necessary conditions? What if $x_0$ is only weakly sparse?
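Hedged illustration: a minimal proximal-gradient (ISTA) sketch for BPDN/LASSO, one of the proximal splitting options mentioned earlier (the step size, iteration count, and $\lambda$ are ad hoc choices for this toy instance, not prescriptions from the slides).

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t*||.||_1: componentwise shrinkage."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    """Minimize (1/2)||y - Ax||_2^2 + lam*||x||_1 by proximal gradient."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L with L = ||A||^2 (Lipschitz)
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                         # O(mn) per iteration
        x = soft_threshold(x - step * grad, step * lam)  # O(n) per iteration
    return x

rng = np.random.default_rng(6)
m, n = 30, 100
A = rng.standard_normal((m, n))
x0 = np.zeros(n); x0[[7, 40, 88]] = [2.0, -1.5, 1.0]
eps = 0.01 * rng.standard_normal(m)
x = ista(A, y=A @ x0 + eps, lam=0.05)
print(np.flatnonzero(np.abs(x) > 1e-3))      # support close to {7, 40, 88}
```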
55-57 Sensitivity/stability guarantees. [Figure: original vs recovered signals; one example with a stable support, and one with no stable support but with $\ell_2$ stability.] In some applications, what matters is stability of the support.
58 Guarantees from a geometrical perspective.
59-61 Notions of convex analysis. [Figure: examples of convex and non-convex sets; a convex set $C$ inside its affine hull $\mathrm{aff}(C)$.] Definition (relative interior): the relative interior $\mathrm{ri}(C)$ of a convex set $C$ is its interior relative to $\mathrm{aff}(C)$. Examples: for a singleton $C = \{x\}$, $\mathrm{aff}(C) = \{x\}$ and $\mathrm{ri}(C) = \{x\}$; for a segment $C = [x, x']$, $\mathrm{aff}(C)$ is the line generated by $(x, x')$ and $\mathrm{ri}(C) = \,]x, x'[$; for the simplex in $\mathbb{R}^n$, $C = \{x : \sum_{i=1}^n x_i = 1,\ x_i \in [0, 1]\}$, $\mathrm{aff}(C) = \{x : \sum_{i=1}^n x_i = 1\}$ and $\mathrm{ri}(C) = \{x : \sum_{i=1}^n x_i = 1,\ x_i \in \,]0, 1[\,\}$.
62-66 Subdifferential. Definition (subdifferential): the subdifferential of a convex function $f$ at $x \in \mathbb{R}^n$ is the set of slopes of affine functions minorizing $f$ at $x$:
$\partial f(x) = \{u \in \mathbb{R}^n : \forall z \in \mathbb{R}^n,\ f(z) \ge f(x) + \langle u, z - x \rangle\}$.
[Figure: the epigraph $\mathrm{epi}(f)$ of $f(x) = |x|$ with its supporting lines.] For $f(x) = |x|$ on $\mathbb{R}$: $\partial f(x) = \{-1\}$ for $x < 0$, $\partial f(x) = \{1\}$ for $x > 0$, and $\partial f(0) = [-1, 1]$.
For the $\ell_1$ norm, $\partial \|x\|_1 = \times_{i=1}^n \partial |x_i|$, i.e., with $I := \mathrm{supp}(x)$,
$\partial \|x\|_1 = \{u \in \mathbb{R}^n : u_I = \mathrm{sign}(x_I),\ \|u\|_\infty \le 1\}$.
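Hedged illustration: a short NumPy check (toy vectors) of the $\ell_1$ subdifferential characterization just stated.

```python
import numpy as np

def in_subdiff_l1(u, x, tol=1e-12):
    """Test u in the subdifferential of ||.||_1 at x:
    u_i = sign(x_i) on supp(x), and |u_i| <= 1 everywhere."""
    I = np.abs(x) > tol
    return np.all(np.abs(u) <= 1 + tol) and np.allclose(u[I], np.sign(x[I]))

x = np.array([0.0, 2.0, -3.0, 0.0])
print(in_subdiff_l1(np.array([0.3, 1.0, -1.0, -0.9]), x))  # True
print(in_subdiff_l1(np.array([1.2, 1.0, -1.0, 0.0]), x))   # False: |u_0| > 1
```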
67-69 Normal cone. Definition (normal cone): the normal cone to a set $C$ at $x \in C$ is
$N_C(x) = \{u \in \mathbb{R}^n : \langle u, z - x \rangle \le 0,\ \forall z \in C\}$.
[Figure: the normal cone $N_C(x)$ at a boundary point of a convex set $C$; when $C$ is a subspace $V$, $N_C(x) = V^\perp$.]
70-71 Optimality conditions for (BP). $\min_{x \in \mathbb{R}^n} \|x\|_1$ s.t. $y = A x$ (BP). Since the feasible set $\{x : A x = y\}$ is an affine set parallel to $\ker(A)$, whose normal cone is $\mathrm{span}(A^\top)$:
$x^\star \in \mathrm{Arg}\min_{x \in \mathbb{R}^n} \|x\|_1$ s.t. $y = A x$
$\iff 0 \in \partial \|x^\star\|_1 + N_{\{x : A x = y\}}(x^\star)$
$\iff \mathrm{span}(A^\top) \cap \partial \|x^\star\|_1 \neq \emptyset$
$\iff \exists\, \eta \in \mathbb{R}^m$ s.t. $A_I^\top \eta = \mathrm{sign}(x^\star_I)$ and $\|A^\top \eta\|_\infty \le 1$, where $I := \mathrm{supp}(x^\star)$.
[Figure: the set $\{x : y = A x\}$, parallel to $\ker(A)$, touching the $\ell_1$ ball at $x^\star$, with $A^\top \eta$ a subgradient there.]
72 Dual certificate. Definition: a vector $\eta \in \mathbb{R}^m$ verifying the source condition $A^\top \eta \in \partial \|x_0\|_1$ is called a dual certificate associated to $x_0$. [Figure: $A^\top \eta$ and the set $\{x : y = A x\}$ at $x_0$, with $\ker(A)$.]
73-75 Non-degenerate dual certificate. Definition: a vector $\eta \in \mathbb{R}^m$ verifying
$A^\top \eta \in \mathrm{ri}(\partial \|x_0\|_1) \iff A_I^\top \eta = \mathrm{sign}((x_0)_I)$ and $\|A_{I^c}^\top \eta\|_\infty < 1$, where $I := \mathrm{supp}(x_0)$,
is called a non-degenerate dual certificate. Geometrically: $A^\top \eta$ hits the relative boundary of $\partial \|x_0\|_1$ $\iff$ $\{x : y = A x\}$ is tangent to a higher-dimensional face of the $\ell_1$ ball $\iff$ the solution is non-unique.
76 Restricted injectivity. Assumption: $A_I$ is full column rank, where $I := \mathrm{supp}(x_0)$. This is a natural assumption: in the noiseless case $y = A x_0$, even if $I$ were known, $y = A x_0 = A_I (x_0)_I$, so there is no hope of recovering $x_0$ uniquely if $A_I$ has a non-trivial kernel. All recovery conditions in the literature assume a form of restricted injectivity.
77 Exact recovery. $\min_{x \in \mathbb{R}^n} \|x\|_1$ s.t. $y = A x$ (BP). Theorem: let $I = \mathrm{supp}(x_0)$. Assume that there exists a non-degenerate dual certificate at $x_0$ and that $A_I$ is full-rank. Then $x_0$ is the unique solution to (BP). This condition is even necessary when $x_0$ is non-trivial.
78-79 Stability without support recovery. $y = A x_0 + \varepsilon$, $\min_{x \in \mathbb{R}^n} \frac{1}{2} \|y - A x\|_2^2 + \lambda \|x\|_1$, $\lambda > 0$ (BPDN/LASSO). Theorem: let $I = \mathrm{supp}(x_0)$. Assume that there exists a non-degenerate dual certificate $\eta$ at $x_0$ and that $A_I$ is full-rank. Then, choosing $\lambda = c \|\varepsilon\|_2$, $c > 0$, any minimizer $x^\star$ of (BPDN/LASSO) obeys $\|x^\star - x_0\|_2 \le C(c, A, I, \eta) \|\varepsilon\|_2$. This condition is even necessary when $x_0$ is non-trivial.
80-81 Stable support and sign recovery. $y = A x_0 + \varepsilon$, $\min_{x \in \mathbb{R}^n} \frac{1}{2} \|y - A x\|_2^2 + \lambda \|x\|_1$, $\lambda > 0$ (BPDN/LASSO). Theorem: let $I = \mathrm{supp}(x_0)$. Assume that $A_I$ is full-rank and that $\eta_F = A_I (A_I^\top A_I)^{-1} \mathrm{sign}((x_0)_I)$ is a non-degenerate dual certificate. Then, choosing $c_1 \|\varepsilon\|_2 < \lambda < c_2 \min_{i \in I} |(x_0)_i|$, (BPDN/LASSO) has a unique solution $x^\star$ which moreover satisfies $\mathrm{supp}(x^\star) = I$ and $\mathrm{sign}(x^\star) = \mathrm{sign}(x_0)$. This is almost necessary when $x_0$ is non-trivial.
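Hedged illustration: a NumPy sketch (toy random instance, made-up support and signs) computing the certificate $\eta_F$ from the theorem and checking non-degeneracy, i.e., $\|A_{I^c}^\top \eta_F\|_\infty < 1$.

```python
import numpy as np

rng = np.random.default_rng(7)
m, n = 30, 60
A = rng.standard_normal((m, n)) / np.sqrt(m)   # roughly unit-norm columns

I = np.array([4, 51])                 # support of x_0 (toy choice)
s = np.array([1.0, -1.0])             # sign((x_0)_I)

AI = A[:, I]
# eta_F = A_I (A_I^T A_I)^{-1} sign((x_0)_I)
eta_F = AI @ np.linalg.solve(AI.T @ AI, s)

Ic = np.setdiff1d(np.arange(n), I)
print(np.allclose(AI.T @ eta_F, s))          # True: interpolation on I
print(np.max(np.abs(A[:, Ic].T @ eta_F)))    # typically < 1 here: non-degenerate
```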
82 Take-away messages. Convex relaxation is good for sparse recovery. Many (tight) guarantees with nice geometrical insight: exact noiseless recovery; stability without support recovery; stable support recovery. Can we translate these conditions into sample complexity bounds? Yes: random measurements (next lecture).
83 Thanks. Any questions?