The rise and fall of regularizations: l1 vs. l2 representer theorems

Michael Unser, Biomedical Imaging Group, EPFL, Lausanne, Switzerland
Joint work with Julien Fageot, John-Paul Ward, and Harshit Gupta
London Workshop on Sparse Signal Processing, Sept. 2016, Imperial College.

OUTLINE
- Linear inverse problems and regularization
- Tikhonov regularization
- The sparsity (r)evolution
- Compressed sensing and l1 minimization
- Part I: Discrete-domain regularization (l2 vs. l1)
- Part II: Continuous-domain regularization (L2 vs. gTV)
- Classical L2 regularization: theory of RKHS
- Splines and operators
- Minimization of gTV: the optimality of splines
- Enabling components for the proof
- Special case: TV in 1D

Inverse problems in bio-imaging

Linear forward model: $y = Hs + n$, where $H$ is an integral operator and $n$ is noise.
Problem: recover $s$ from the noisy measurements $y$.
The easy scenario: the inverse problem is well posed if there exists $c_0 > 0$ such that $c_0\|s\| \le \|Hs\|$ for all $s \in \mathcal{X}$; one can then form $s = H^{-1}y$.
Backprojection (poor man's solution): $s \approx H^T y$.
Basic limitations:
1) Inherent noise amplification
2) Difficulty to invert $H$ (too large or non-square)
3) All interesting inverse problems are ill-posed

Linear inverse problems (20th century theory)

Dealing with ill-posed problems: Tikhonov regularization (Andrey N. Tikhonov, 1906-1993).
$R(s) = \|Ls\|_2^2$: regularization (or smoothness) functional; $L$: regularization operator (e.g., the gradient).
$\min_s R(s)$ subject to $\|y - Hs\|_2^2 \le \sigma^2$
Equivalent variational problem:
$s^\star = \arg\min_s \underbrace{\|y - Hs\|_2^2}_{\text{data consistency}} + \underbrace{\lambda\|Ls\|_2^2}_{\text{regularization}}$
Formal linear solution: $s = (H^T H + \lambda L^T L)^{-1} H^T y = R_\lambda\, y$.
Interpretation: filtered backprojection.
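As a quick illustration of the formal linear solution, here is a minimal numerical sketch; the sizes, operators, and noise level are illustrative assumptions, not the speaker's setup:

```python
import numpy as np

# Minimal sketch of the Tikhonov solution s = (H^T H + lam * L^T L)^{-1} H^T y,
# with a hypothetical small inverse-problem setup (H, L, y are illustrative).
rng = np.random.default_rng(0)
N = 64
s_true = np.cumsum(rng.standard_normal(N))          # smooth-ish ground truth
H = rng.standard_normal((32, N)) / np.sqrt(N)       # generic sensing matrix
y = H @ s_true + 0.05 * rng.standard_normal(32)     # noisy measurements

L = np.eye(N) - np.eye(N, k=1)                      # finite-difference (gradient) operator
lam = 0.1
s_hat = np.linalg.solve(H.T @ H + lam * L.T @ L, H.T @ y)  # regularized estimate
```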

Linear inverse problems: the sparsity (r)evolution

(20th century) $p = 2$ → (21st century) $p = 1$
$s_{\mathrm{rec}} = \arg\min_s \|y - Hs\|_2^2 + \lambda R(s)$
Non-quadratic regularization: $R(s) = \|Ls\|_{\ell_2}^2 \to \|Ls\|_{\ell_p}^p \to \|Ls\|_{\ell_1}$
- Total variation (Rudin-Osher, 1992): $R(s) = \|Ls\|_{\ell_1}$ with $L$: gradient
- Wavelet-domain regularization (Figueiredo et al., Daubechies et al., 2004): $v = W^{-1}s$, the wavelet expansion of $s$ (typically sparse); $R(s) = \|v\|_{\ell_1}$
- Compressed sensing/sampling (Candès-Romberg-Tao; Donoho, 2006)

Compressive sensing (CS) and l1 minimization
[Donoho et al., 2005; Candès-Tao, 2006, ...]

Measurement model: $y = Ax\, +$ noise.
Sparse representation of the signal: $s = Wx$ with $\|x\|_0 = K \ll N$.
Equivalent $M \times N$ sensing matrix: $A = HW$.
Constrained (synthesis) formulation of the recovery problem:
$\min_x \|x\|_1$ subject to $\|y - Ax\|_2^2 \le \sigma^2$
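The constrained synthesis problem is typically attacked through its Lagrangian form; below is a minimal ISTA sketch under an assumed random Gaussian sensing setup (the solver choice and all parameters are illustrative, not prescribed by the slides):

```python
import numpy as np

def ista(A, y, lam, n_iter=500):
    """Minimal ISTA sketch for min_x 0.5*||y - A x||_2^2 + lam*||x||_1,
    the Lagrangian counterpart of the constrained synthesis problem."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1/Lipschitz constant of A^T A
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x + step * (A.T @ (y - A @ x))          # gradient step on the data term
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return x

# Hypothetical CS experiment: K-sparse x0, Gaussian sensing matrix A = HW.
rng = np.random.default_rng(1)
M, N, K = 40, 128, 5
A = rng.standard_normal((M, N)) / np.sqrt(M)
x0 = np.zeros(N); x0[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
y = A @ x0 + 0.01 * rng.standard_normal(M)
x_hat = ista(A, y, lam=0.02)                        # sparse reconstruction
```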

CS: three fundamental ingredients
(Donoho, IEEE T. Inf. Theory, 2006) (Candès-Romberg, Inv. Prob., 2007)

1. Existence of a sparsifying transform ($W$ or $L$)
- Wavelet basis
- Dictionary
- Differential operator (gradient)
2. Incoherence of the sensing matrix $A$
- Restricted isometry; few linearly dependent columns (spark)
- Quasi-random and delocalized structure: Gaussian matrix with i.i.d. entries, random sampling in the Fourier domain
3. Non-linear signal recovery (l1 minimization)

CS: examples of applications in imaging
- Magnetic resonance imaging (MRI) (Lustig, Mag. Res. Im. 2007)
- Radio interferometry (Wiaux, Notic. R. Astro. 2007)
- Terahertz imaging (Chan, Appl. Phys. 2008)
- Digital holography (Brady, Opt. Express 2009; Marim 2010)
- Spectral-domain OCT (Liu, Opt. Express 2010)
- Coded-aperture spectral imaging (Arce, IEEE Sig. Proc. 2014)
- Localization microscopy (Zhu, Nat. Meth. 2012)
- Ultrafast photography (Gao, Nature 2014)

Part I: Discrete-domain regularization

Classical regularized least-squares estimator

Linear measurement model: $y_m = \langle h_m, x\rangle + n[m]$, $m = 1, \dots, M$
System matrix of size $M \times N$: $H = [h_1 \cdots h_M]^T$
$x_{\mathrm{LS}} = \arg\min_{x \in \mathbb{R}^N} \|y - Hx\|_2^2 + \lambda\|x\|_2^2$
$\Rightarrow\ x_{\mathrm{LS}} = (H^T H + \lambda I_N)^{-1} H^T y = H^T a = \sum_{m=1}^M a_m h_m$, where $a = (H H^T + \lambda I_M)^{-1} y$
Interpretation: $x_{\mathrm{LS}} \in \mathrm{span}\{h_m\}_{m=1}^M$
Lemma: $(H^T H + \lambda I_N)^{-1} H^T = H^T (H H^T + \lambda I_M)^{-1}$
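A quick numerical check of the lemma, which is what makes the $M$-dimensional measurement-space formulation practical (sizes and data are illustrative assumptions):

```python
import numpy as np

# Numerical check of (H^T H + lam I_N)^{-1} H^T = H^T (H H^T + lam I_M)^{-1}:
# the left side works in R^N, the right side in the (smaller) R^M.
rng = np.random.default_rng(2)
M, N, lam = 5, 12, 0.3
H = rng.standard_normal((M, N))
lhs = np.linalg.solve(H.T @ H + lam * np.eye(N), H.T)
rhs = H.T @ np.linalg.inv(H @ H.T + lam * np.eye(M))
print(np.allclose(lhs, rhs))   # True
```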

Generalization: constrained l2 minimization

Discrete signal to reconstruct: $x = (x[n])_{n \in \mathbb{Z}}$
Sensing operator $\mathrm{H}: \ell_2(\mathbb{Z}) \to \mathbb{R}^M$, $x \mapsto z = \mathrm{H}\{x\} = (\langle x, h_1\rangle, \dots, \langle x, h_M\rangle)$ with $h_m \in \ell_2(\mathbb{Z})$
Closed convex set in measurement space: $C \subset \mathbb{R}^M$. Example: $C_y = \{z \in \mathbb{R}^M : \|y - z\|_2^2 \le \sigma^2\}$

Representer theorem for constrained $\ell_2$ minimization (U.-Fageot-Gupta, IEEE Trans. Info. Theory, Sept. 2016):
(P2) $\min_{x \in \ell_2(\mathbb{Z})} \|x\|_{\ell_2}^2$ s.t. $\mathrm{H}\{x\} \in C$
The problem (P2) has a unique solution of the form $x_{\mathrm{LS}} = \sum_{m=1}^M a_m h_m = \mathrm{H}^*\{a\}$ with expansion coefficients $a = (a_1, \dots, a_M) \in \mathbb{R}^M$.

Constrained l1 minimization: sparsifying effect

Discrete signal to reconstruct: $x = (x[n])_{n \in \mathbb{Z}}$
Sensing operator $\mathrm{H}: \ell_1(\mathbb{Z}) \to \mathbb{R}^M$, $x \mapsto z = \mathrm{H}\{x\} = (\langle x, h_1\rangle, \dots, \langle x, h_M\rangle)$ with $h_m \in \ell_\infty(\mathbb{Z})$
Closed convex set in measurement space: $C \subset \mathbb{R}^M$

Representer theorem for constrained $\ell_1$ minimization (U.-Fageot-Gupta, IEEE Trans. Info. Theory, Sept. 2016):
(P1) $\mathcal{V} = \arg\min_{x \in \ell_1(\mathbb{Z})} \|x\|_{\ell_1}$ s.t. $\mathrm{H}\{x\} \in C$
$\mathcal{V}$ is convex and weak*-compact, with extreme points of the form
$x_{\mathrm{sparse}}[\cdot] = \sum_{k=1}^K a_k\, \delta[\cdot - n_k]$ with $K = \|x_{\mathrm{sparse}}\|_0 \le M$.
If the CS condition is satisfied, then the solution is unique.

Controlling sparsity

Measurement model: $y_m = \langle h_m, x\rangle + n[m]$, $m = 1, \dots, M$
$x_{\mathrm{sparse}} = \arg\min_{x \in \ell_1(\mathbb{Z})} \Big( \sum_{m=1}^M |y_m - \langle h_m, x\rangle|^2 + \lambda \|x\|_{\ell_1} \Big)$
[Figure: sparsity index $K$ as a function of $\lambda$ for conventional, DCT, and CS sensing of a sparse model; $K$ decreases as $\lambda$ grows. A numerical sketch of this experiment follows below.]

Geometry of l2 vs. l1 minimization

Prototypical inverse problem:
$\min_x \|y - Hx\|_{\ell_2}^2 + \lambda\|x\|_{\ell_2}^2 \;\Leftrightarrow\; \min_x \|x\|_{\ell_2}$ subject to $\|y - Hx\|_{\ell_2}^2 \le \sigma^2$
$\min_x \|y - Hx\|_{\ell_2}^2 + \lambda\|x\|_{\ell_1} \;\Leftrightarrow\; \min_x \|x\|_{\ell_1}$ subject to $\|y - Hx\|_{\ell_2}^2 \le \sigma^2$
[Figure: constraint set $C_y = \{x : y_1 = h_1^T x\}$ in the $(x_1, x_2)$-plane, with the smallest $\ell_2$-ball and $\ell_1$-ball touching it.]
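The sparsity-control sketch referenced above, using scikit-learn's Lasso as a stand-in l1 solver; the sensing setup and the $\lambda$ grid are illustrative assumptions, not the experiment shown on the slide:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Sweep lambda and record the sparsity index K (number of nonzero coefficients).
rng = np.random.default_rng(3)
M, N = 60, 100
H = rng.standard_normal((M, N)) / np.sqrt(M)
x0 = np.zeros(N); x0[rng.choice(N, 8, replace=False)] = 1.0
y = H @ x0 + 0.01 * rng.standard_normal(M)

for lam in [1e-4, 1e-3, 1e-2, 1e-1]:
    x = Lasso(alpha=lam, fit_intercept=False, max_iter=10000).fit(H, y).coef_
    print(lam, np.count_nonzero(np.abs(x) > 1e-8))  # K decreases as lambda grows
```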

Geometry of l2 vs. l1 minimization (continued)

Prototypical inverse problem:
$\min_x \|y - Hx\|_{\ell_2}^2 + \lambda\|x\|_{\ell_2}^2 \;\Leftrightarrow\; \min_x \|x\|_{\ell_2}$ subject to $\|y - Hx\|_{\ell_2}^2 \le \sigma^2$
$\min_x \|y - Hx\|_{\ell_2}^2 + \lambda\|x\|_{\ell_1} \;\Leftrightarrow\; \min_x \|x\|_{\ell_1}$ subject to $\|y - Hx\|_{\ell_2}^2 \le \sigma^2$
[Figure: the $\ell_1$-ball touches the constraint line $y_1 = h_1^T x$ at sparse extreme points; when a face of the ball is parallel to the line, the configuration yields a non-unique $\ell_1$ solution.]

Part II: Continuous-domain regularization

(Photo courtesy of Carl de Boor)

Continuous-domain regularization (L2 scenario)

Regularization functional: $\|Lf\|_{L_2}^2 = \int_{\mathbb{R}^d} |Lf(x)|^2\, dx$, with $L$ a suitable differential operator
Theory of reproducing kernel Hilbert spaces (Aronszajn, 1950): $\langle f, g\rangle_{\mathcal{H}} = \langle Lf, Lg\rangle$
Interpolation and approximation theory:
- Smoothing splines (Schoenberg 1964; Kimeldorf-Wahba 1971)
- Thin-plate splines, radial basis functions (Duchon 1977)
Machine learning:
- Radial basis functions, kernel methods
- Representer theorem(s) (Poggio-Girosi 1990; Schölkopf-Smola 2001)

Representer theorem for L2 regularization

(P2) $\arg\min_{f \in \mathcal{H}} \sum_{m=1}^M |y_m - f(x_m)|^2 + \lambda\|f\|_{\mathcal{H}}^2$
$h: \mathbb{R}^d \times \mathbb{R}^d \to \mathbb{R}$ is the (unique) reproducing kernel for the Hilbert space $\mathcal{H}(\mathbb{R}^d)$ if
(i) $h(x_0, \cdot) \in \mathcal{H}$ for all $x_0 \in \mathbb{R}^d$;
(ii) $f(x_0) = \langle h(x_0, \cdot), f\rangle_{\mathcal{H}}$ for all $f \in \mathcal{H}$ and $x_0 \in \mathbb{R}^d$.
Convex loss function: $F: \mathbb{R}^M \times \mathbb{R}^M \to \mathbb{R}$; sample values: $\mathbf{f} = \big(f(x_1), \dots, f(x_M)\big)$
(P2') $\arg\min_{f \in \mathcal{H}} F(y, \mathbf{f}) + \lambda\|f\|_{\mathcal{H}}^2$
Representer theorem for $L_2$-regularization (Schölkopf-Smola, 2001): the generic parametric form of the solution of (P2') is
$f(x) = \sum_{m=1}^M a_m h(x, x_m)$
Supports the theory of SVMs, kernel methods, etc.; a kernel-regression sketch follows below.

Sparsity and continuous-domain modeling

Compressed sensing (CS):
- Generalized sampling and infinite-dimensional CS (Adcock-Hansen, 2011)
- Xampling: CS of analog signals (Eldar, 2011)
Splines and approximation theory:
- $L_1$ splines (Fisher-Jerome, 1975)
- Locally-adaptive regression splines (Mammen-van de Geer, 1997)
- Generalized TV (Steidl et al. 2005; Bredies et al. 2010)
Statistical modeling:
- Sparse stochastic processes (Unser et al.)
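The kernel-regression sketch announced above: kernel ridge regression with a Gaussian kernel, whose solution has exactly the representer form $f(x) = \sum_m a_m h(x, x_m)$. The kernel choice, bandwidth, and data are illustrative assumptions:

```python
import numpy as np

# Kernel ridge regression: a = (K + lam I)^{-1} y, f(x) = sum_m a_m h(x, x_m).
def gauss_kernel(x, xp, sigma=0.2):
    return np.exp(-(x[:, None] - xp[None, :]) ** 2 / (2 * sigma**2))

rng = np.random.default_rng(4)
xm = np.sort(rng.uniform(0, 1, 20))                # sample locations x_m
y = np.sin(2 * np.pi * xm) + 0.1 * rng.standard_normal(20)

lam = 1e-3
K = gauss_kernel(xm, xm)
a = np.linalg.solve(K + lam * np.eye(len(xm)), y)  # expansion coefficients

xs = np.linspace(0, 1, 200)
f = gauss_kernel(xs, xm) @ a                       # f(x) = sum_m a_m h(x, x_m)
```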

Geometry of l2 vs. l1 minimization (recap)

$\min_x \|y - Hx\|_{\ell_2}^2 + \lambda\|x\|_{\ell_2}^2 \;\Leftrightarrow\; \min_x \|x\|_{\ell_2}$ subject to $\|y - Hx\|_{\ell_2}^2 \le \sigma^2$
$\min_x \|y - Hx\|_{\ell_2}^2 + \lambda\|x\|_{\ell_1} \;\Leftrightarrow\; \min_x \|x\|_{\ell_1}$ subject to $\|y - Hx\|_{\ell_2}^2 \le \sigma^2$

Geometry of L2 vs. TV minimization

Prototypical inverse problem:
$\min_s \|y - \mathrm{H}\{s\}\|_{\ell_2}^2 + \lambda\|\mathrm{L}\{s\}\|_{L_2}^2 \;\Leftrightarrow\; \min_s \|\mathrm{L}\{s\}\|_{L_2}^2$ subject to $\|y - \mathrm{H}\{s\}\|_{\ell_2}^2 \le \sigma^2$
$\min_s \|y - \mathrm{H}\{s\}\|_{\ell_2}^2 + \lambda\|\mathrm{L}\{s\}\|_{\mathrm{TV}} \;\Leftrightarrow\; \min_s \|\mathrm{L}\{s\}\|_{\mathrm{TV}}$ subject to $\|y - \mathrm{H}\{s\}\|_{\ell_2}^2 \le \sigma^2$
[Figure, with measurements $y_1 = \langle h_1, s\rangle$: the $L_2$ solution is an L-spline with $M$ fixed knots (H: pure sampling operator); the TV solution is an L-spline with few adaptive knots (H: can be arbitrary).]

Splines are analog and intrinsically sparse

$\mathrm{L}\{\cdot\}$: admissible differential operator; $\delta(\cdot - x_0)$: Dirac impulse shifted by $x_0 \in \mathbb{R}^d$
Definition: the function $s: \mathbb{R}^d \to \mathbb{R}$ is a (non-uniform) L-spline with knots $(x_k)_{k=1}^K$ if
$\mathrm{L}\{s\} = \sum_{k=1}^K a_k\, \delta(\cdot - x_k) = w_\delta$: the spline's innovation
Spline theory: (Schultz-Varga, 1967)
[Figure: a spline with knots $x_k, x_{k+1}$ and weights $a_k$, for $\mathrm{L} = \frac{d}{dx}$.]
FRI signal processing (Vetterli et al., 2002), innovation variables ($2K$):
- Location of the singularities (knots): $\{x_k\}_{k=1}^K$
- Strength of the singularities (linear weights): $\{a_k\}_{k=1}^K$

Spline synthesis: example

$\mathrm{L} = \mathrm{D} = \frac{d}{dx}$; $\rho_{\mathrm{D}}(x) = \mathrm{D}^{-1}\{\delta\}(x) = \mathbb{1}_+(x)$: Heaviside (unit-step) function
Null space: $\mathcal{N}_{\mathrm{D}} = \mathrm{span}\{p_1\}$, $p_1(x) = 1$
$w_\delta(x) = \sum_k a_k\, \delta(x - x_k)$
$s(x) = b_1 p_1(x) + \sum_k a_k\, \mathbb{1}_+(x - x_k)$
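A minimal synthesis sketch for the example above, building a D-spline (a piecewise-constant signal) from its innovation; the knots and amplitudes are arbitrary illustrative values:

```python
import numpy as np

# D-spline synthesis: s(x) = b1*p1(x) + sum_k a_k * step(x - x_k), with L = D.
def d_spline(x, b1, knots, amps):
    s = b1 * np.ones_like(x)              # null-space component b1*p1, p1(x) = 1
    for xk, ak in zip(knots, amps):
        s += ak * (x >= xk)               # Heaviside step shifted to the knot
    return s

x = np.linspace(0, 10, 1000)
s = d_spline(x, b1=0.5, knots=[2.0, 4.5, 7.0], amps=[1.0, -1.5, 0.8])
# ||D s||_M equals the sum of the jump magnitudes: 1.0 + 1.5 + 0.8
```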

Spline synthesis: generalization

$\mathrm{L}$: spline-admissible operator (LSI)
$\rho_{\mathrm{L}}(x) = \mathrm{L}^{-1}\{\delta\}(x)$: Green's function of $\mathrm{L}$
Finite-dimensional null space: $\mathcal{N}_{\mathrm{L}} = \mathrm{span}\{p_n\}_{n=1}^{N_0}$
Spline's innovation: $w_\delta(x) = \sum_k a_k\, \delta(x - x_k)$
$\Rightarrow\ s(x) = \sum_k a_k\, \rho_{\mathrm{L}}(x - x_k) + \sum_{n=1}^{N_0} b_n p_n(x)$
Requires the specification of boundary conditions.

Principled operator-based approach

Biorthogonal basis of $\mathcal{N}_{\mathrm{L}} = \mathrm{span}\{p_n\}_{n=1}^{N_0}$: $\boldsymbol{\phi} = (\phi_1, \dots, \phi_{N_0})$ such that $\langle \phi_m, p_n\rangle = \delta_{m,n}$
Projection operator: $p = \sum_{n=1}^{N_0} \langle \phi_n, p\rangle\, p_n$ for all $p \in \mathcal{N}_{\mathrm{L}}$
Boundary conditions: $\langle s, \phi_n\rangle = b_n$, $n = 1, \dots, N_0$
Spline's innovation: $\mathrm{L}\{s\} = w_\delta = \sum_k a_k\, \delta(\cdot - x_k)$
Operator-based spline synthesis: $s(x) = \mathrm{L}_{\boldsymbol{\phi}}^{-1}\{w_\delta\}(x) + \sum_{n=1}^{N_0} b_n p_n(x)$
Existence of $\mathrm{L}_{\boldsymbol{\phi}}^{-1}$ as a stable right-inverse of $\mathrm{L}$? (see Theorem 1):
$\mathrm{L}\,\mathrm{L}_{\boldsymbol{\phi}}^{-1} w = w$, $\quad \boldsymbol{\phi}(\mathrm{L}_{\boldsymbol{\phi}}^{-1} w) = 0$

Beyond splines: function spaces for gTV
(Photo courtesy of Carl de Boor)

From Dirac impulses to Borel measures

$\mathcal{S}(\mathbb{R}^d)$: Schwartz space of smooth and rapidly decaying test functions on $\mathbb{R}^d$
$\mathcal{S}'(\mathbb{R}^d)$: Schwartz space of tempered distributions
Space of real-valued, countably additive Borel measures on $\mathbb{R}^d$:
$\mathcal{M}(\mathbb{R}^d) = \big(C_0(\mathbb{R}^d)\big)' = \big\{ w \in \mathcal{S}'(\mathbb{R}^d) : \|w\|_{\mathcal{M}} = \sup_{\varphi \in \mathcal{S}(\mathbb{R}^d):\, \|\varphi\|_\infty = 1} \langle w, \varphi\rangle < \infty \big\}$,
where $w: \varphi \mapsto \langle w, \varphi\rangle = \int_{\mathbb{R}^d} \varphi(r)\, w(r)\, dr$
Equivalent definition of the total-variation norm:
$\|w\|_{\mathcal{M}} = \sup_{\varphi \in C_0(\mathbb{R}^d):\, \|\varphi\|_\infty = 1} \langle w, \varphi\rangle$
Basic inclusions:
- $\delta(\cdot - x_0) \in \mathcal{M}(\mathbb{R}^d)$ with $\|\delta(\cdot - x_0)\|_{\mathcal{M}} = 1$ for any $x_0 \in \mathbb{R}^d$
- $\|f\|_{\mathcal{M}} = \|f\|_{L_1(\mathbb{R}^d)}$ for all $f \in L_1(\mathbb{R}^d)$ $\Rightarrow$ $L_1(\mathbb{R}^d) \subseteq \mathcal{M}(\mathbb{R}^d)$

Optimality result for Dirac measures

$\mathrm{F}$: linear continuous map $\mathcal{M}(\mathbb{R}^d) \to \mathbb{R}^M$; $C$: convex compact subset of $\mathbb{R}^M$
Generic constrained TV minimization problem:
$\mathcal{V} = \arg\min_{w \in \mathcal{M}(\mathbb{R}^d):\, \mathrm{F}(w) \in C} \|w\|_{\mathcal{M}}$
Generalized Fisher-Jerome theorem (U.-Fageot-Ward, arXiv 2016): the solution set $\mathcal{V}$ is a convex, weak*-compact subset of $\mathcal{M}(\mathbb{R}^d)$ with extremal points of the form
$w = \sum_{k=1}^K a_k\, \delta(\cdot - x_k)$ with $K \le M$ and $x_k \in \mathbb{R}^d$.
(Jerome-Fisher, 1975: compact domain & scalar intervals.)

General convex problems with gTV regularization

$\mathcal{M}_{\mathrm{L}}(\mathbb{R}^d) = \big\{ s : \mathrm{gTV}(s) = \|\mathrm{L}\{s\}\|_{\mathcal{M}} = \sup_{\|\varphi\|_\infty \le 1} \langle \mathrm{L}\{s\}, \varphi\rangle < \infty \big\}$
Linear measurement operator $\mathcal{M}_{\mathrm{L}}(\mathbb{R}^d) \to \mathbb{R}^M$: $f \mapsto z = \mathrm{H}\{f\}$; $C$: convex compact subset of $\mathbb{R}^M$
Finite-dimensional null space $\mathcal{N}_{\mathrm{L}} = \{q \in \mathcal{M}_{\mathrm{L}}(\mathbb{R}^d) : \mathrm{L}\{q\} = 0\}$ with basis $\{p_n\}_{n=1}^{N_0}$
Admissibility of the regularization: $\mathrm{H}\{q_1\} = \mathrm{H}\{q_2\} \Leftrightarrow q_1 = q_2$ for all $q_1, q_2 \in \mathcal{N}_{\mathrm{L}}$
Representer theorem for gTV regularization (U.-Fageot-Ward, arXiv 2016): the extremal points of the constrained minimization problem
$\mathcal{V} = \arg\min_{f \in \mathcal{M}_{\mathrm{L}}(\mathbb{R}^d)} \|\mathrm{L}\{f\}\|_{\mathcal{M}}$ s.t. $\mathrm{H}\{f\} \in C$
are necessarily of the form
$f(x) = \sum_{k=1}^K a_k\, \rho_{\mathrm{L}}(x - x_k) + \sum_{n=1}^{N_0} b_n p_n(x)$ with $K \le M - N_0$;
that is, non-uniform L-splines with knots at the $x_k$ and $\|\mathrm{L}\{f\}\|_{\mathcal{M}} = \sum_{k=1}^K |a_k|$. The full solution set $\mathcal{V}$ is the convex hull of those extremal points.

Enabling components for the proof of the theorem

Existence of a stable right-inverse operator

$L_{\infty, n_0}(\mathbb{R}^d) = \big\{ f: \mathbb{R}^d \to \mathbb{R} : \sup_{x \in \mathbb{R}^d} |f(x)|\,(1 + \|x\|)^{-n_0} < +\infty \big\}$
Theorem 1 (U.-Fageot-Ward, arXiv 2016): Let $\mathrm{L}$ be a spline-admissible operator with an $N_0$-dimensional null space $\mathcal{N}_{\mathrm{L}} \subset L_{\infty, n_0}(\mathbb{R}^d)$ such that $p = \sum_{n=1}^{N_0} \langle p, \phi_n\rangle\, p_n$ for all $p \in \mathcal{N}_{\mathrm{L}}$. Then, there exists a unique and stable operator $\mathrm{L}_{\boldsymbol{\phi}}^{-1}: \mathcal{M}(\mathbb{R}^d) \to L_{\infty, n_0}(\mathbb{R}^d)$ such that, for all $w \in \mathcal{M}(\mathbb{R}^d)$:
- Right-inverse property: $\mathrm{L}\,\mathrm{L}_{\boldsymbol{\phi}}^{-1} w = w$;
- Boundary conditions: $\boldsymbol{\phi}(\mathrm{L}_{\boldsymbol{\phi}}^{-1} w) = 0$ with $\boldsymbol{\phi} = (\phi_1, \dots, \phi_{N_0})$.
Its generalized impulse response $g_{\boldsymbol{\phi}}(x, y) = \mathrm{L}_{\boldsymbol{\phi}}^{-1}\{\delta(\cdot - y)\}(x)$ is given by
$g_{\boldsymbol{\phi}}(x, y) = \rho_{\mathrm{L}}(x - y) - \sum_{n=1}^{N_0} p_n(x)\, q_n(y)$
with $\rho_{\mathrm{L}}$ such that $\mathrm{L}\{\rho_{\mathrm{L}}\} = \delta$ and $q_n(y) = \langle \phi_n, \rho_{\mathrm{L}}(\cdot - y)\rangle$.
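A numerical sanity check of Theorem 1 in the simplest concrete setting; the specific choices ($\mathrm{L} = \mathrm{D}$, $\phi_1 = $ point evaluation at $0$) are illustrative assumptions, not the construction used in the talk:

```python
import numpy as np

# With L = D, null space span{p1}, p1(x) = 1, and phi1 = delta (evaluation at 0),
# we have rho(x) = step(x), q1(y) = <phi1, rho(. - y)> = step(-y), and
# g(x, y) = rho(x - y) - p1(x)*q1(y).
def step(t):
    return (np.asarray(t) >= 0.0).astype(float)

def g(x, y):
    return step(x - y) - 1.0 * step(-y)

x = np.linspace(-1.0, 1.0, 2001)
for y0 in (0.3, -0.3):
    f = g(x, y0)                        # f = L_phi^{-1}{delta(. - y0)}
    jump_at = x[np.argmax(np.diff(f))]  # D f is a unit Dirac located here
    print(f"knot {y0:+.1f}: unit jump near {jump_at:+.3f}, f(0) = {float(g(0.0, y0)):.1f}")
# Right-inverse property: D f = delta(. - y0); boundary condition: f(0) = 0.
```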

Characterization of generalized Beppo-Levi spaces

Regularization operator $\mathrm{L}: \mathcal{M}_{\mathrm{L}}(\mathbb{R}^d) \to \mathcal{M}(\mathbb{R}^d)$; $f \in \mathcal{M}_{\mathrm{L}}(\mathbb{R}^d) \Leftrightarrow \mathrm{gTV}(f) = \|\mathrm{L}\{f\}\|_{\mathcal{M}} < \infty$
Theorem 2 (U.-Fageot-Ward, arXiv 2016): Let $\mathrm{L}$ be a spline-admissible operator that admits a stable right-inverse $\mathrm{L}_{\boldsymbol{\phi}}^{-1}$ of the form specified by Theorem 1. Then, any $f \in \mathcal{M}_{\mathrm{L}}(\mathbb{R}^d)$ has the unique representation $f = \mathrm{L}_{\boldsymbol{\phi}}^{-1} w + p$, where $w = \mathrm{L}\{f\} \in \mathcal{M}(\mathbb{R}^d)$ and $p = \sum_{n=1}^{N_0} \langle \phi_n, f\rangle\, p_n \in \mathcal{N}_{\mathrm{L}}$ with $\phi_n \in \big(\mathcal{M}_{\mathrm{L}}(\mathbb{R}^d)\big)'$. Moreover, $\mathcal{M}_{\mathrm{L}}(\mathbb{R}^d)$ is a Banach space when equipped with the norm $\|f\|_{\mathrm{L}, \boldsymbol{\phi}} = \|\mathrm{L}f\|_{\mathcal{M}} + \|\boldsymbol{\phi}(f)\|_2$.
Generalized Beppo-Levi space: $\mathcal{M}_{\mathrm{L}}(\mathbb{R}^d) = \mathcal{M}_{\mathrm{L}, \boldsymbol{\phi}}(\mathbb{R}^d) \oplus \mathcal{N}_{\mathrm{L}}$, where
$\mathcal{M}_{\mathrm{L}, \boldsymbol{\phi}}(\mathbb{R}^d) = \{ f \in \mathcal{M}_{\mathrm{L}}(\mathbb{R}^d) : \boldsymbol{\phi}(f) = 0 \}$ and $\mathcal{N}_{\mathrm{L}} = \{ p \in \mathcal{M}_{\mathrm{L}}(\mathbb{R}^d) : \mathrm{L}\{p\} = 0 \}$

Example: convex problem with TV regularization

$\mathrm{L} = \mathrm{D} = \frac{d}{dx}$; $\mathcal{N}_{\mathrm{D}} = \mathrm{span}\{p_1\}$, $p_1(x) = 1$; $\rho_{\mathrm{D}}(x) = \mathbb{1}_+(x)$: Heaviside function
General linear inverse problem with TV regularization:
$\min_{s \in \mathcal{M}_{\mathrm{D}}(\mathbb{R})} \|\mathrm{D}\{s\}\|_{\mathcal{M}}$ s.t. $\mathrm{H}\{s\} = (\langle h_1, s\rangle, \dots, \langle h_M, s\rangle) \in C(y)$
Generic form of the solution (by Theorem 4):
$s(x) = b_1 + \sum_{k=1}^K a_k\, \mathbb{1}_+(x - x_k)$
with $K < M$ and free parameters $b_1$ (not penalized, since $p_1 \in \mathcal{N}_{\mathrm{D}}$) and $(a_k, x_k)_{k=1}^K$.
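On a discretized grid, the TV problem above becomes a linear program; the sketch below (a hypothetical random-sampling setup, not the speaker's experiment) recovers a piecewise-constant solution with fewer than $M$ jumps, as the representer theorem predicts:

```python
import numpy as np
from scipy.optimize import linprog

# Discretized 1D TV problem: min ||D u||_1 s.t. H u = y, as an LP with slack t:
# minimize sum(t) s.t. D u - t <= 0, -D u - t <= 0, H u = y, t >= 0.
rng = np.random.default_rng(5)
N, M = 80, 6
D = np.diff(np.eye(N), axis=0)                    # (N-1) x N finite differences
H = np.zeros((M, N)); H[np.arange(M), rng.choice(N, M, replace=False)] = 1.0
y = rng.standard_normal(M)                        # measurements to interpolate

c = np.r_[np.zeros(N), np.ones(N - 1)]            # objective acts on the slack t
A_ub = np.block([[D, -np.eye(N - 1)], [-D, -np.eye(N - 1)]])
b_ub = np.zeros(2 * (N - 1))
A_eq = np.hstack([H, np.zeros((M, N - 1))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * N + [(0, None)] * (N - 1))
u = res.x[:N]                                     # piecewise-constant reconstruction
print("number of jumps:", np.count_nonzero(np.abs(D @ u) > 1e-6))  # < M
```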

SUMMARY: Sparsity in infinite dimensions

Discrete-domain formulation
- Contrasting behavior of $\ell_1$ vs. $\ell_2$ regularization
- Minimization of $\ell_1$ favors sparse solutions (independently of the sensing matrix)
Continuous-domain formulation
- Linear measurement model: $s \mapsto z = \mathrm{H}\{s\}$
- Linear signal model (PDE): $\mathrm{L}s = w$, $s \in \mathcal{X}$ $\Rightarrow$ $s = \mathrm{L}^{-1}w$
- L-splines = signals with the sparsest innovation
- $\mathrm{gTV}(s) = \|\mathrm{L}s\|_{\mathcal{M}}$
- Deterministic optimality result: gTV regularization favors sparse innovations
- Non-uniform L-splines: universal solutions of linear inverse problems

Acknowledgments

Many thanks to (former) members of EPFL's Biomedical Imaging Group: Dr. Pouya Tafti, Prof. Arash Amini, Dr. Emrah Bostan, Dr. Masih Nilchian, Dr. Ulugbek Kamilov, Dr. Cédric Vonesch, ... and collaborators: Prof. Demetri Psaltis, Prof. Marco Stampanoni, Prof. Carlos-Oscar Sorzano, Dr. Arne Seitz, ...
Preprints and demos:

References

New results on sparsity-promoting regularization:
- M. Unser, J. Fageot, H. Gupta, "Representer Theorems for Sparsity-Promoting l1 Regularization," IEEE Trans. Information Theory, vol. 62, no. 9, Sept. 2016.
- M. Unser, J. Fageot, J.P. Ward, "Splines Are Universal Solutions of Linear Inverse Problems with Generalized-TV Regularization," arXiv:1603.01427 [math.FA].
Theory of sparse stochastic processes:
- M. Unser and P. Tafti, An Introduction to Sparse Stochastic Processes, Cambridge University Press. For splines, see Chapter 6.
Algorithms and imaging applications:
- E. Bostan, U.S. Kamilov, M. Nilchian, M. Unser, "Sparse Stochastic Processes and Discretization of Linear Inverse Problems," IEEE Trans. Image Processing, vol. 22, no. 7, 2013.
- C. Vonesch, M. Unser, "A Fast Multilevel Algorithm for Wavelet-Regularized Image Restoration," IEEE Trans. Image Processing, vol. 18, no. 3, March 2009.
- M. Nilchian, C. Vonesch, S. Lefkimmiatis, P. Modregger, M. Stampanoni, M. Unser, "Constrained Regularized Reconstruction of X-Ray-DPCI Tomograms with Weighted-Norm," Optics Express, vol. 21, no. 26, 2013.

Link with the total variation of Rudin-Osher

Total variation of a function $\ne$ total variation of a measure.
Total variation in 1D: $\mathrm{TV}(f) = \sup_{\|\varphi\|_\infty \le 1} \langle \mathrm{D}f, \varphi\rangle = \|\mathrm{D}f\|_{\mathcal{M}}$: perfect equivalence (with $\mathrm{L} = \mathrm{D}$).
Usual total variation in 2D: $\mathrm{TV}(f) = \sup_{\|\varphi\|_\infty \le 1} \langle \nabla f, \varphi\rangle$. Problem: $\nabla = (\partial_x, \partial_y)$ is not a scalar operator.
$L_1$ version of the 2D total variation:
$\mathrm{TV}(f) = \int_{\mathbb{R}^2} |\nabla f(x, y)|\, dx\, dy = \frac{1}{4} \int_0^{2\pi} \|\mathrm{D}_u f\|_{\mathcal{M}}\, d\theta$ (angular averaging, rotation invariance),
where $\mathrm{D}_u f = \langle u, \nabla f\rangle$ is the directional derivative of $f$ along $u = (\cos\theta, \sin\theta)$.
The present theory explains the regularization effect of $\|\mathrm{D}_u f\|_{\mathcal{M}}$.
