Generalized greedy algorithms.
1 Generalized greedy algorithms. François-Xavier Dupé & Sandrine Anthoine, LIF & I2M, Aix-Marseille Université, CNRS, École Centrale Marseille, Marseille. ANR Greta. Séminaire Parisien des Mathématiques Appliquées à l'imagerie, 05/01/2017.
2 Sparse approximation. Sparsity is good. [Figure: Saturn (original).]
3 Sparse approximation. Sparsity is good. [Figures: Saturn (original); Saturn (coefficients).]
4 Sparse approximation. Sparsity is good. [Figures: Saturn (original); Saturn (2.6% of the coefficients).]
5 Sparse approximation. Classically: find the best k-sparse approximation of $y$ on the dictionary $\Phi$,
$$\min_{x \in \mathbb{R}^n} \|y - \Phi x\|_2^2 \quad \text{s.t.} \quad \|x\|_0 \le k. \tag{A}$$
6 Sparse approximation. More generally: find the best k-sparse minimizer,
$$\min_{x \in \mathcal{H}} f(x) \quad \text{s.t.} \quad \|x\|_0 \le k. \tag{P}$$
7 Sparse approximation. Today: find a k-sparse zero of an operator $T : \mathcal{H} \to \mathcal{H}$ (e.g. $T = \nabla f$),
$$\text{Find } x \in \mathcal{H} \ \text{s.t.} \ \|x\|_0 \le k \ \text{and} \ 0 = T(x). \tag{Q}$$
$\mathcal{H}$: Hilbert space.
8 Literature. How to solve these problems?
9 Literature. In the linear setting (A), at least four families of methods: convex relaxation; MP, OMP; CoSaMP, SP; IHT, HTP.
10 Literature. How to generalize? Convergence?
11 Literature. Generalizations for (P): OMP [Zhang 2011]; GraSP [Bahmani et al. 2013]; IHT, HTP [Yuan et al. 2013].
12 Literature. Today's talk. Goal: solve (P) and/or (Q). Approach: greedy, theoretically grounded, inspired by CoSaMP [Needell, Tropp, 2009] and GraSP [Bahmani et al., 2013].
13 Today's topic. 1. CoSaMP and its generalizations. 2. The Restricted Diagonal Property. 3. Three other generalizations. 4. Poisson Noise Removal.
14 Today's topic. 1. CoSaMP and its generalizations: CoSaMP and its guarantees; GraSP and its guarantees; GCoSaMP and its guarantees. 2. The Restricted Diagonal Property. 3. Three other generalizations: Generalized Subspace Pursuit; Generalized Hard Thresholding Pursuit; Generalized Iterative Hard Thresholding. 4. Poisson Noise Removal: Moreau-Yosida regularization; Experiments.
15 Optimization problem. Let $T : \mathcal{H} \to \mathcal{H}$ be an operator and $k$ the expected sparsity. We wish to solve:
$$\text{Find } x \in \mathcal{H} \text{ such that } T(x) = 0 \text{ and } \|x\|_0 \le k. \tag{Q}$$
16 Optimization problem. Special case: $T = \nabla f$; (Q) asks for a critical point of $f$ that is $k$-sparse. Related problem: find a minimizer of $f$ among the $k$-sparse vectors. Related algorithms: $T(x) = \nabla_x\left(\|\Phi x - y\|_2^2\right)$: CoSaMP, SP...; $T(x) = \nabla f(x)$, $f$ convex: GraSP, GOMP...
17-19 CoSaMP [Needell, Tropp, 2009]. Goal: $\min_x \|\Phi x - y\|_2^2$ s.t. $\|x\|_0 \le k$.
Algorithm. Require: $y$, $\Phi$, $k$. Initialization: $x^0 = 0$. For $t = 0$ to $N-1$:
- $G = \mathrm{supp}([\Phi^*(\Phi x^t - y)]_{2k})$ (select new directions),
- $S = G \cup \mathrm{supp}(x^t)$ (set extended support),
- $z = \operatorname{argmin}_{\{x \,:\, \mathrm{supp}(x) \subseteq S\}} \|\Phi x - y\|_2^2$ (solve on extended support),
- $T = \mathrm{supp}(z_k)$ (set support),
- $x^{t+1} = z_k$ (approximately solve on the support).
Output: $x^N$.
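To make the steps concrete, here is a minimal numpy sketch of the iteration above (the names `hard_threshold` and `cosamp` are ours, not from the talk; the least-squares solve plays the role of the argmin on the extended support):

```python
import numpy as np

def hard_threshold(x, k):
    """z_k: keep the k largest-magnitude entries of x, zero the rest."""
    z = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-k:]
    z[keep] = x[keep]
    return z

def cosamp(Phi, y, k, n_iter=50):
    """CoSaMP: greedy approximation of min ||Phi x - y||_2^2 s.t. ||x||_0 <= k."""
    n = Phi.shape[1]
    x = np.zeros(n)
    for _ in range(n_iter):
        g = Phi.T @ (Phi @ x - y)                    # gradient of the data term
        G = np.argsort(np.abs(g))[-2 * k:]           # select 2k new directions
        S = np.union1d(G, np.flatnonzero(x))         # extended support
        z = np.zeros(n)
        z[S] = np.linalg.lstsq(Phi[:, S], y, rcond=None)[0]  # solve on S
        x = hard_threshold(z, k)                     # x^{t+1} = z_k
    return x
```

With a random Gaussian $\Phi$ and enough measurements this typically recovers a planted $k$-sparse vector up to the noise level.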
20 CoSaMP guarantees. Goal: $\min_x \|\Phi x - y\|_2^2$ s.t. $\|x\|_0 \le k$. Restricted Isometry Property: $\Phi$ has the Restricted Isometry Property with constant $\delta_k$ if and only if, for all $x$ with $\mathrm{card}(\mathrm{supp}(x)) \le k$,
$$(1 - \delta_k)\|x\|^2 \le \|\Phi x\|^2 \le (1 + \delta_k)\|x\|^2.$$
21 CoSaMP guarantees. Consider $y = \Phi u + e$.
Theorem (CoSaMP error bound). If $\Phi$ has the RIP with constant $\delta_{4k} \le 0.1$, then at iteration $t$, $x^t$ verifies
$$\|x^t - u\| \le \frac{1}{2^t}\|u\| + 20\nu,$$
with $\nu$ the incompressible error: $\nu = \|u - u_k\| + \frac{1}{\sqrt{k}}\|u - u_k\|_1 + \|e\|_2$.
22 Gradient Support Pursuit [Bahmani et al., 2013]. New goal: $\min_x f(x)$ s.t. $\|x\|_0 \le k$; CoSaMP is recalled for comparison.
23 Gradient Support Pursuit (GraSP). Require: $f$, $k$. Initialization: $x^0 = 0$. For $t = 0$ to $N-1$:
- $G = \mathrm{supp}([\nabla f(x^t)]_{2k})$ (select new directions),
- $S = G \cup \mathrm{supp}(x^t)$ (set extended support),
- $z \in \operatorname{argmin}_{\{x \,:\, \mathrm{supp}(x) \subseteq S\}} f(x)$ (solve on extended support),
- $T = \mathrm{supp}(z_k)$ (set support),
- $x^{t+1} = z_k$ (approximately solve on the support).
Output: $x^N$.
24 GraSP guarantees. Goal: $\min_x f(x)$ s.t. $\|x\|_0 \le k$. Stable Restricted Hessian: $f$ has a Stable Restricted Hessian with constant $\mu_k$ if and only if $\frac{A_k(x)}{B_k(x)} \le \mu_k$ for all $x$ with $\mathrm{card}(\mathrm{supp}(x)) \le k$, where
$$A_k(x) = \sup\left\{ \frac{\langle y, H_f(x)\, y\rangle}{\|y\|_2^2} \,:\, \mathrm{card}(\mathrm{supp}(x) \cup \mathrm{supp}(y)) \le k \right\},$$
$$B_k(x) = \inf\left\{ \frac{\langle y, H_f(x)\, y\rangle}{\|y\|_2^2} \,:\, \mathrm{card}(\mathrm{supp}(x) \cup \mathrm{supp}(y)) \le k \right\}.$$
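For the quadratic objective of (A), the Hessian $H_f(x) = \Phi^*\Phi$ is constant, so $A_k$ and $B_k$ reduce to extreme eigenvalues of $k$-column Gram submatrices. A Monte-Carlo sketch of this special case (names ours; sampling supports only yields a lower bound on the true ratio $\mu_k$):

```python
import numpy as np

def srh_ratio_quadratic(Phi, k, n_trials=2000, seed=0):
    """Estimate A_k, B_k, mu_k = A_k/B_k for f(x) = 0.5*||Phi x - y||^2,
    whose restricted Hessian on a support S is Phi_S^T Phi_S."""
    rng = np.random.default_rng(seed)
    n = Phi.shape[1]
    A, B = 0.0, np.inf
    for _ in range(n_trials):
        S = rng.choice(n, size=k, replace=False)      # random size-k support
        eigs = np.linalg.eigvalsh(Phi[:, S].T @ Phi[:, S])
        A = max(A, eigs[-1])                          # largest restricted eigenvalue seen
        B = min(B, eigs[0])                           # smallest restricted eigenvalue seen
    return A, B, A / B
```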
25 GraSP guarantees.
Theorem (GraSP error bound). If $f$ has a Stable Restricted Hessian with constant $\mu_{4k}$ and there exists $\epsilon > 0$ such that $B_{4k}(u) \ge \epsilon$ for all $u$, then at iteration $t$, $x^t$ verifies
$$\|x^t - u\| \le \frac{1}{2^t}\|u\| + \frac{C}{\epsilon}\,\|\nabla f(u)_{3k}\|.$$
26 GraSP guarantees. Remarks: an error bound is also available for $f$ only once differentiable; the notion of convexity is restricted to $k$-sparse vectors.
27 Generalized CoSaMP. New goal: Find $x \in \mathcal{H}$ s.t. $T(x) = 0$ and $\|x\|_0 \le k$; GraSP is recalled for comparison.
28 GCoSaMP. Require: $T$, $k$. Initialization: $x^0 = 0$. For $t = 0$ to $N-1$:
- $G = \mathrm{supp}([T(x^t)]_{2k})$ (select new directions),
- $S = G \cup \mathrm{supp}(x^t)$ (set extended support),
- find $z$ s.t. $\mathrm{supp}(z) \subseteq S$ and $T(z)|_S = 0$ (solve on extended support),
- $T = \mathrm{supp}(z_k)$ (set support),
- $x^{t+1} = z_k$ (approximately solve on the support).
Output: $x^N$.
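A sketch of GCoSaMP for a generic operator $T$ on $\mathbb{R}^n$ (names ours; the inner step asking for $T(z)|_S = 0$ is implemented here by damped fixed-point iterations restricted to $S$, one possible inner solver among many):

```python
import numpy as np

def hard_threshold(x, k):
    z = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-k:]
    z[keep] = x[keep]
    return z

def solve_on_support(T, S, n, eta=0.5, n_inner=200):
    """Approximate z with supp(z) in S and T(z)|_S = 0 via z <- z - eta*T(z)|_S."""
    mask = np.zeros(n, dtype=bool)
    mask[np.asarray(S, dtype=int)] = True
    z = np.zeros(n)
    for _ in range(n_inner):
        z = z - eta * np.where(mask, T(z), 0.0)
    return z

def gcosamp(T, n, k, n_iter=50):
    """GCoSaMP: the CoSaMP skeleton with Phi^*(Phi x - y) replaced by T."""
    x = np.zeros(n)
    for _ in range(n_iter):
        G = np.argsort(np.abs(T(x)))[-2 * k:]     # 2k new directions
        S = np.union1d(G, np.flatnonzero(x))      # extended support
        z = solve_on_support(T, S, n)             # T(z)|_S = 0 (approximately)
        x = hard_threshold(z, k)                  # x^{t+1} = z_k
    return x
```

With `T = lambda x: Phi.T @ (Phi @ x - y)` this reduces to CoSaMP, up to the choice of inner solver.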
29 Uniform Restricted Diagonal Property. For $T : \mathcal{H} \to \mathcal{H}$, let $\mathcal{D}_1 = \left\{ D : x \mapsto \sum_i d_i x_i e_i \ \text{s.t.} \ \forall x,\ \|Dx\| \le \|x\| \right\}$.
Definition (Uniform Restricted Diagonal Property). $T$ is said to have the Uniform Restricted Diagonal Property (URDP) of order $k$ if there exist $\alpha_k > 0$ and a diagonal operator $D_k$ in $\mathcal{D}_1$ such that
$$\forall (x, y) \in \mathcal{H}^2, \quad \mathrm{card}(\mathrm{supp}(x) \cup \mathrm{supp}(y)) \le k \implies \|T(x) - T(y) - D_k(x - y)\| \le \alpha_k \|x - y\|.$$
30 Uniform Restricted Diagonal Property.
Definition (Restricted Diagonal Property). $T$ is said to have the Restricted Diagonal Property (RDP) of order $k$ if there exists $\alpha_k > 0$ such that for all subsets $S$ of $\mathbb{N}$ of cardinality at most $k$, there exists a diagonal operator $D_S$ in $\mathcal{D}_1$ such that
$$\forall (x, y) \in \mathcal{H}^2, \quad \mathrm{supp}(x) \subseteq S \ \text{and} \ \mathrm{supp}(y) \subseteq S \implies \|T(x) - T(y) - D_S(x - y)\| \le \alpha_k \|x - y\|.$$
31 Generalized CoSaMP. Goal: Find $x \in \mathcal{H}$ s.t. $T(x) = 0$ and $\|x\|_0 \le k$.
Theorem (Generalized CoSaMP error bound). Denote by $x^\star$ any $k$-sparse vector. If there exists $\rho > 0$ such that $\rho T$ has the Restricted Diagonal Property of order $4k$ with $\alpha_{4k} \le \alpha_C$, then at iteration $t$, $x^t$ verifies
$$\|x^t - x^\star\| \le \frac{1}{2^t}\|x^\star\| + 12\rho\,\|T(x^\star)_{3k}\|. \tag{1}$$
32 Today's topic. Outline recalled; next: 2. The Restricted Diagonal Property.
33-35 The Restricted Diagonal Property and RIP.
Definition (Restricted Diagonal Property). $T$ has the RDP of order $k$: $\exists \alpha_k > 0$ such that $\forall S \subseteq \mathbb{N}$ with $|S| \le k$, $\exists D_S \in \mathcal{D}_1$ such that $\mathrm{supp}(x) \subseteq S$ and $\mathrm{supp}(y) \subseteq S$ imply $\|T(x) - T(y) - D_S(x - y)\| \le \alpha_k \|x - y\|$.
Assume that $T$ has the RDP of order $2k$; then for $\|x\|_0 \le k$ and $\|y\|_0 \le k$:
- $\|T(x) - T(y)\| \ge (1 - \alpha_{2k})\|x - y\|$ (injectivity);
- $(1 - \alpha_{2k})\|x - y\| \le \|T(x) - T(y)\| \le (\|D\| + \alpha_{2k})\|x - y\|$ (with $D = D_{2k}$, or $\sup_S \|D_S\|$ if it exists);
- for $T(x) = \Phi^*(\Phi x - z)$, these two-sided bounds amount to $\Phi$ having the RIP.
36-37 Uniform Restricted Diagonal Property: characterization.
Theorem. $\beta T$ has the URDP of order $k$ for $D$, with $\alpha_k < 1$ and some $\beta > 0$, if and only if there exist $(m, L)$ with $0 < m$ and $0 \le \|D\|^2 - \frac{m^2}{L^2} < 1$ such that $\mathrm{card}(\mathrm{supp}(x) \cup \mathrm{supp}(y)) \le k$ implies
$$\|T(x) - T(y)\| \le L\|x - y\| \quad \text{and} \quad \langle T(x) - T(y), D(x - y)\rangle \ge m\|x - y\|^2.$$
Interpretation: an $L$-Lipschitz property on sparse elements; for $D = I$, a monotone operator.
38 If $T = \nabla f$: the $L$-Lipschitz property is Restricted Strong Smoothness; for $D = I$ one gets Restricted Strong Convexity, recovering the conditions of [Bahmani et al., 2013].
39 Today's topic. Outline recalled; next: 3. Three other generalizations.
40 Generalized Subspace Pursuit. Side by side with GCoSaMP:
GCoSaMP. Require: $T$, $k$. Initialization: $x^0 = 0$. For $t = 0$ to $N-1$: $G = \mathrm{supp}([T(x^t)]_{2k})$; $S = G \cup \mathrm{supp}(x^t)$; $z$ s.t. $\mathrm{supp}(z) \subseteq S$, $T(z)|_S = 0$; $T = \mathrm{supp}(z_k)$; $x^{t+1} = z_k$. Output: $x^N$.
GSP. Require: $T$, $k$. Initialization: $x^0 = 0$. For $t = 0$ to $N-1$: $G = \mathrm{supp}([T(x^t)]_k)$; $S = G \cup \mathrm{supp}(x^t)$; $z$ s.t. $\mathrm{supp}(z) \subseteq S$, $T(z)|_S = 0$; $T = \mathrm{supp}(z_k)$; $x^{t+1}$ s.t. $\mathrm{supp}(x^{t+1}) \subseteq T$ and $T(x^{t+1})|_T = 0$. Output: $x^N$.
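The same skeleton gives a GSP sketch: only $k$ new directions are selected, and each iteration ends with a debiasing solve on the pruned support. This reuses `hard_threshold` and `solve_on_support` from the GCoSaMP sketch above:

```python
import numpy as np

def gsp(T, n, k, n_iter=50):
    """Generalized Subspace Pursuit, following the right-hand column above."""
    x = np.zeros(n)
    for _ in range(n_iter):
        G = np.argsort(np.abs(T(x)))[-k:]             # only k new directions
        S = np.union1d(G, np.flatnonzero(x))          # extended support
        z = solve_on_support(T, S, n)                 # T(z)|_S = 0
        Tk = np.flatnonzero(hard_threshold(z, k))     # pruned support supp(z_k)
        x = solve_on_support(T, Tk, n)                # debias: T(x)|_Tk = 0
    return x
```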
41 Generalized Subspace Pursuit: guarantees.
Theorem (Generalized CoSaMP error bound, recalled). If $\rho T$ has the Restricted Diagonal Property of order $4k$ with $\alpha_{4k} \le \alpha_C$, then at iteration $t$ of GCoSaMP, $x^t$ verifies $\|x^t - x^\star\| \le \frac{1}{2^t}\|x^\star\| + 12\rho\,\|T(x^\star)_{3k}\|$.
Theorem (Generalized Subspace Pursuit error bound). If $\rho T$ has the Restricted Diagonal Property of order $3k$ with $\alpha_{3k} \le \alpha_S$, then at iteration $t$ of GSP, $x^t$ verifies $\|x^t - x^\star\| \le \frac{1}{2^t}\|x^\star\| + 12\rho\,\|T(x^\star)_{2k}\|$.
Here $x^\star$ is $k$-sparse and $\alpha_S$ is the real root of $x^3 + x^2 + 7x - 1$.
42 Generalized Hard Thresholding Pursuit. Side by side with GCoSaMP:
GCoSaMP. Require: $T$, $k$. Initialization: $x^0 = 0$. For $t = 0$ to $N-1$: $G = \mathrm{supp}([T(x^t)]_{2k})$; $S = G \cup \mathrm{supp}(x^t)$; $z$ s.t. $\mathrm{supp}(z) \subseteq S$, $T(z)|_S = 0$; $T = \mathrm{supp}(z_k)$; $x^{t+1} = z_k$. Output: $x^N$.
GHTP. Require: $T$, $k$, $\eta$. Initialization: $x^0 = 0$. For $t = 0$ to $N-1$: $G = \mathrm{supp}([T(x^t)]_k)$; $S = G \cup \mathrm{supp}(x^t)$; $z = [(I - \eta T)(x^t)]_S$; $T = \mathrm{supp}(z_k)$; $x^{t+1}$ s.t. $\mathrm{supp}(x^{t+1}) \subseteq T$ and $T(x^{t+1})|_T = 0$. Output: $x^N$.
43 Generalized Hard Thresholding Pursuit: guarantees.
Theorem (Generalized CoSaMP error bound, recalled). If $\rho T$ has the Restricted Diagonal Property of order $4k$ with $\alpha_{4k} \le \alpha_C$, then at iteration $t$ of GCoSaMP, $x^t$ verifies $\|x^t - x^\star\| \le \frac{1}{2^t}\|x^\star\| + 12\rho\,\|T(x^\star)_{3k}\|$.
Theorem (Generalized HTP error bound). If $T$ has the Uniform Restricted Diagonal Property of order $2k$ with $D_{2k} = I$, $\alpha_{2k} \le \alpha_H$ and $\frac{3}{4} < \eta < \frac{5}{4}$, then at iteration $t$ of GHTP, $x^t$ verifies
$$\|x^t - x^\star\| \le \frac{1}{2^t}\|x^\star\| + \frac{2\,\big((1 + 2\eta)(1 - \alpha_{2k}) + 4\big)}{(1 - \alpha_{2k})^2}\,\|T(x^\star)_{2k}\|. \tag{2}$$
Here $x^\star$ is $k$-sparse.
44 Generalized Iterative Hard Thresholding. Side by side with GCoSaMP:
GCoSaMP. Require: $T$, $k$. Initialization: $x^0 = 0$. For $t = 0$ to $N-1$: $G = \mathrm{supp}([T(x^t)]_{2k})$; $S = G \cup \mathrm{supp}(x^t)$; $z$ s.t. $\mathrm{supp}(z) \subseteq S$, $T(z)|_S = 0$; $T = \mathrm{supp}(z_k)$; $x^{t+1} = z_k$. Output: $x^N$.
GIHT. Require: $T$, $k$, $\eta$. Initialization: $x^0 = 0$. For $t = 0$ to $N-1$: $z = (I - \eta T)(x^t)$; $T = \mathrm{supp}(z_k)$; $x^{t+1} = z_k$. Output: $x^N$.
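GIHT is the cheapest of the four: one application of $I - \eta T$ and one hard threshold per iteration. A sketch (names ours):

```python
import numpy as np

def giht(T, n, k, eta=1.0, n_iter=200):
    """Generalized IHT: a step of (I - eta*T) followed by hard thresholding."""
    x = np.zeros(n)
    for _ in range(n_iter):
        z = x - eta * T(x)                 # z = (I - eta T)(x^t)
        keep = np.argsort(np.abs(z))[-k:]
        x = np.zeros(n)
        x[keep] = z[keep]                  # x^{t+1} = z_k
    return x
```

With $T(x) = \Phi^*(\Phi x - y)$ and $\eta = 1$ this is classical IHT.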
45 Generalized Iterative Hard Thresholding: guarantees.
Theorem (Generalized CoSaMP error bound, recalled). If $\rho T$ has the Restricted Diagonal Property of order $4k$ with $\alpha_{4k} \le \alpha_C$, then at iteration $t$ of GCoSaMP, $x^t$ verifies $\|x^t - x^\star\| \le \frac{1}{2^t}\|x^\star\| + 12\rho\,\|T(x^\star)_{3k}\|$.
Theorem (Generalized IHT error bound). If $T$ has the Uniform Restricted Diagonal Property of order $2k$ with $D_{2k} = I$, $\frac{3}{4} < \eta < \frac{5}{4}$ and $\alpha_{2k} \le \alpha_\eta$, then at iteration $t$ of GIHT, $x^t$ verifies
$$\|x^t - x^\star\| \le \frac{1}{2^t}\|x^\star\| + 4\eta\,\|T(x^\star)_{3k}\|.$$
Here $x^\star$ is $k$-sparse and $\alpha_\eta = \frac{1 - 4|\eta - 1|}{4(1 + |\eta - 1|)}$.
46-48 About these error bounds.
For all four algorithms: guaranteed convergence to the unique solution if it exists, at an exponential rate; incompressible error of the form $\|T(x^\star)_k\|$.
For GCoSaMP and GSP: invariance of the algorithm to scaling, no parameters to set; guarantees hold in the RDP case (no monotonicity of $T$ or convexity of $f$ required).
For GHTP and GIHT: require careful setting of the step $\eta$ w.r.t. $T$; guarantees hold only in the URDP case with $D = I$ (monotonicity of $T$, or convexity of $f$, required).
49 Today's topic. Outline recalled; next: 4. Poisson Noise Removal.
50 Forward model. The observed data $y$ is a Poisson-noise-corrupted version of $x$: $y \sim \mathcal{P}(x)$. Sparsity assumption: $x = \Phi\alpha$ with $\|\alpha\|_0 \le k$ and $k \ll n$. Here $y$ is the observation (e.g. a $\sqrt{n} \times \sqrt{n}$ image), $x$ the true image, $\Phi \in \mathbb{R}^{n \times d}$ a dictionary (redundant if $d > n$), and $\alpha$ the coefficients to be found ($x = \Phi\alpha$).
51-52 Regularized sparse Poisson denoising. Writing $F_y(x) = -\log P(y \mid x)$, one naturally seeks to minimize the neg-log-likelihood under a sparsity constraint:
$$\min_{\alpha \in \mathbb{R}^d} F_y(\Phi\alpha) \quad \text{s.t.} \quad \|\alpha\|_0 \le k,$$
with $F_y : x \in \mathbb{R}^n \mapsto \sum_{i=1}^n f_y^i(x[i])$ and
$$f_y^i(\xi) = \begin{cases} -y[i]\log(\xi) + \xi & \text{if } y[i] > 0 \text{ and } \xi > 0,\\ \xi & \text{if } y[i] = 0 \text{ and } \xi \ge 0,\\ +\infty & \text{otherwise.} \end{cases}$$
Issues: the likelihood does not have full domain; there is no tolerance if a pixel is removed (set to 0); the gradient is non-Lipschitz.
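A direct transcription of $f_y^i$ into code (function name ours; the constant $\log(y[i]!)$ term of the Poisson likelihood is dropped, as on the slide). The listed issues are visible here: the domain is not all of $\mathbb{R}^n$, and the $-y\log\xi$ term blows up as $\xi \to 0$:

```python
import numpy as np

def poisson_nll(x, y):
    """F_y(x) = sum_i f_y^i(x[i]), the Poisson neg-log-likelihood."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    f = np.full_like(x, np.inf)          # +infinity outside the domain
    zero = (y == 0) & (x >= 0)           # f = xi               if y[i] = 0, xi >= 0
    pos = (y > 0) & (x > 0)              # f = -y log(xi) + xi  if y[i] > 0, xi > 0
    f[zero] = x[zero]
    f[pos] = -y[pos] * np.log(x[pos]) + x[pos]
    return float(f.sum())
```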
53 Regularized sparse Poisson denoising. Instead, we propose to minimize a regularized neg-log-likelihood under a sparsity constraint:
$$\min_{\alpha \in \mathbb{R}^d} M_{\lambda, F_y}(\Phi\alpha) \quad \text{s.t.} \quad \|\alpha\|_0 \le k,$$
which is equivalent to using GCoSaMP/GSP with
$$T = \frac{1}{\nu\lambda}\,\Phi^*\,(I - \mathrm{prox}_{\nu\lambda F_y})\,\Phi.$$
$M_{\lambda,f}$ is the Moreau-Yosida regularization of $f$ with parameter $\lambda$; $\mathrm{prox}_f$ is the proximal operator; $\nu$ is the frame bound for $\Phi$.
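Assembling this $T$ in code, assuming a tight frame ($\Phi\Phi^* = \nu I$) and the coordinatewise prox formula stated on the appendix slide below (names ours):

```python
import numpy as np

def prox_poisson(x, t, y):
    """prox_{t f_y}(x) coordinatewise: (x - t + sqrt((x - t)^2 + 4 t y)) / 2."""
    return 0.5 * (x - t + np.sqrt((x - t) ** 2 + 4.0 * t * y))

def make_T(Phi, y, lam, nu):
    """T = (1/(nu*lam)) Phi^* (I - prox_{nu*lam F_y}) Phi,
    the gradient of alpha -> M_{lam, F_y}(Phi alpha)."""
    def T(alpha):
        x = Phi @ alpha
        return Phi.T @ (x - prox_poisson(x, nu * lam, y)) / (nu * lam)
    return T
```

Plugging this $T$ into the GCoSaMP/GSP sketches above, e.g. `alpha = gsp(make_T(Phi, y, lam, nu), Phi.shape[1], k)`, gives a denoiser in the spirit of the experiments (our reconstruction, not the authors' code).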
54-55 Experiments. We compare GSP, GCoSaMP, GHTP and GIHT, used to solve
$$\min_{\alpha \in \mathbb{R}^d} M_{\lambda, F_y}(\Phi\alpha) \quad \text{s.t.} \quad \|\alpha\|_0 \le k,$$
with:
- Subspace Pursuit (SP) [Dai and Milenkovic, 2009]: $\min_{\alpha \in \mathbb{R}^d} \|\Phi\alpha - y\|_2^2$ s.t. $\|\alpha\|_0 \le k$;
- $\ell_1$-relaxation: the Forward-Backward-Forward primal-dual algorithm [Combettes et al. 2012] solving $\min_{\alpha \in \mathbb{R}^d} F_y(\Phi\alpha) + \gamma\|\alpha\|_1$;
- SAFIR: an adaptation of BM3D [Boulanger et al., 2010];
- MSVST: a variance-stabilizing method [Zhang et al., 2008].
56-57 Comparing greedy $\ell_0$ and $\ell_1$ results. Figure: Cameraman, maximal intensity 30, undecimated wavelet transform. Panels: (a) Original, (b) Noisy, (c) GSP, (d) GHTP, (e) GCoSaMP, (f) GIHT, (g) $\ell_1$ method.
58 Influence of $\lambda$ in $M_{\lambda,\cdot}$. Figure panels: Original; Noisy; $\lambda = 10$, MAE = 0.81; $\lambda = 0.1$, MAE = 0.74. Maximal intensity: 10; $k$: 1500; $\Phi$: cycle-spinning wavelet transform.
59-60 Results. Figure: NGC 2997 Galaxy image. Panels: (a) Original, (b) Noisy, (c) GSP, (d) SP, (e) $\ell_1$-relaxation, (f) SAFIR, (g) MSVST.
61 Numerical results. Table: comparison of denoising methods (MAE and SSIM of Noisy, GSP, SP, SAFIR, MSVST and $\ell_1$-relaxation) on a sparse version of Cameraman ($k/n = 0.15$) and the NGC 2997 Galaxy.
62 Take-away messages. We have: presented general greedy optimization algorithms with theoretical guarantees; applied them to a Moreau-Yosida regularization of the Poisson likelihood; shown encouraging results for Poisson denoising. Perspectives include: applications in other known settings such as dictionary learning, additional regularization...; gaining understanding of the behavior of GCoSaMP, GSP and co.
63 Thanks for your attention. Any questions?
64-65 Moreau-Yosida regularization.
Definition (Lemaréchal et al., 1997). Let $f : \mathbb{R}^d \to \mathbb{R} \cup \{+\infty\}$ be a proper, convex and lower semi-continuous function. The Moreau-Yosida regularization of $f$ with parameter $\lambda$ is
$$M_{\lambda,f} : \mathbb{R}^d \to \mathbb{R}, \qquad s \mapsto \inf_{x \in \mathbb{R}^d}\left[\frac{1}{2\lambda}\|s - x\|^2 + f(x)\right].$$
Remarks: the gradient of $M_{\lambda,f}$ is linked with the proximal operator; $\lambda$ regulates the similarity between $f$ and $M_{\lambda,f}$.
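A standard worked example (not on the slides) illustrating both remarks: for $f = |\cdot|$ on $\mathbb{R}$, the infimum is attained at the soft-thresholding of $s$ and the envelope is the Huber function,
$$M_{\lambda,|\cdot|}(s) = \begin{cases} \dfrac{s^2}{2\lambda} & \text{if } |s| \le \lambda,\\ |s| - \dfrac{\lambda}{2} & \text{if } |s| > \lambda, \end{cases} \qquad \nabla M_{\lambda,f}(s) = \frac{1}{\lambda}\left(s - \mathrm{prox}_{\lambda f}(s)\right),$$
which is differentiable everywhere and converges pointwise to $|s|$ as $\lambda \to 0$.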
66 Gradient of the Moreau-Yosida regularization of the Poisson likelihood.
Proposition (Combettes and Pesquet, 2007). Let $\Phi$ be a tight frame (i.e. $\exists \nu > 0$ such that $\Phi\Phi^* = \nu I$); then the gradient of the Moreau-Yosida regularization of $F_y \circ \Phi$ is
$$\nabla M_{\lambda, F_y \circ \Phi} = \frac{1}{\nu\lambda}\,\Phi^*\,(I - \mathrm{prox}_{\nu\lambda F_y})\,\Phi,$$
with
$$\mathrm{prox}_{\nu\lambda f_y}(x)[i] = \frac{x[i] - \nu\lambda + \sqrt{(x[i] - \nu\lambda)^2 + 4\nu\lambda\, y[i]}}{2}.$$
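A quick numerical sanity check of the closed form against direct minimization of the prox objective (values arbitrary; requires scipy):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def prox_poisson_scalar(x, t, y):
    """Closed form for prox_{t f}(x) with f(u) = -y*log(u) + u."""
    return 0.5 * (x - t + np.sqrt((x - t) ** 2 + 4.0 * t * y))

x, t, y = 3.0, 0.7, 5.0
# prox_{t f}(x) minimizes u -> 0.5*(u - x)^2 + t*(-y*log(u) + u) over u > 0
direct = minimize_scalar(
    lambda u: 0.5 * (u - x) ** 2 + t * (-y * np.log(u) + u),
    bounds=(1e-12, 1e3), method="bounded", options={"xatol": 1e-12},
).x
assert np.isclose(direct, prox_poisson_scalar(x, t, y), atol=1e-6)
```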
67-68 Optimization problem.
$$\min_{\alpha \in \mathbb{R}^d} M_{\lambda, F_y}(\Phi\alpha) \quad \text{s.t.} \quad \|\alpha\|_0 \le k, \tag{P}$$
with $\Phi$ the dictionary, $\lambda$ the regularization parameter, $k$ the sought sparsity.
Under mild conditions, the bias between the solution of (P) and that of the non-regularized problem is $O(\sqrt{\lambda})$ [Mahey and Tao, 1993].