Sparse modeling and the resolution of inverse problems in biomedical imaging


Sparse modeling and the resolution of inverse problems in biomedical imaging

Michael Unser, Biomedical Imaging Group, EPFL, Lausanne, Switzerland
Plenary talk, IEEE Int. Symp. Biomedical Imaging (ISBI 2015), 16-19 April 2015, New York, USA

[ISBI logo: IEEE International Symposium on Biomedical Imaging: Macro to Nano, 7-10 July 2002, Washington, DC, USA. Logo design: Annette Unser]

Variational formulation of image reconstruction

Linear forward model: g = Hs + n, where H is an integral operator and n is noise. Ill-posed inverse problem: recover s from the noisy measurements g.

Reconstruction as an optimization problem:
s* = arg min_s ||g - Hs||_2^2 + λ R(s)
(data consistency + regularization)

Classical reconstruction = linear algorithm

Quadratic regularization (Tikhonov): R(s) = ||Ls||_2^2
Formal linear solution: s = (H^T H + λ L^T L)^{-1} H^T g = R_λ g

Statistical formulation under the Gaussian hypothesis: L = C_s^{-1/2} is the whitening filter, with signal covariance C_s = E{s s^T}. The Wiener (LMMSE) solution = Gauss MMSE = Gauss MAP:
s_MAP = arg min_s (1/σ²) ||g - Hs||_2^2 + ||C_s^{-1/2} s||_2^2
(data log-likelihood + Gaussian prior)

Current trend: non-linear algorithms (ℓ1 optimization)

s* = arg min_s ||y - Hs||_2^2 + λ R(s)
(data consistency + regularization)

Wavelet-domain regularization: wavelet expansion s = Wv (typically sparse); wavelet-domain sparsity constraint R(s) = ||v||_{ℓ1} with v = W^{-1} s (Nowak et al.; Daubechies et al., 2004)
ℓ1 regularization / total variation (TV) (Rudin-Osher, 1992): R(s) = ||Ls||_{ℓ1} with L: gradient
Compressed sensing/sampling (Candès-Romberg-Tao; Donoho, 2006)

Key research questions (for biomedical imaging)
1. Formulation of the ill-posed reconstruction problem; statistical modeling (beyond Gaussian) supporting non-linear reconstruction schemes (including CS): sparse stochastic processes
2. Efficient implementation for large-scale imaging problems: ADMM = smart chaining of simple modules
3. Future trends and open issues

OUTLINE
- Variational formulation of inverse problems
- Statistical modeling
  - Introduction to sparse stochastic processes
  - Generalized innovation model
  - Statistical characterization of signal
- Algorithm design: reconstruction of biomedical images
  - Discretization of inverse problem
  - Generic MAP estimator (iterative reconstruction algorithm)
- Applications
  - Deconvolution microscopy
  - Computed tomography
  - Cryo-electron tomography
  - Differential phase-contrast tomography

An introduction to sparse stochastic processes (EDEE course)

Random spline: archetype of a sparse signal

Non-uniform spline of degree 0: Ds(t) = Σ_n a_n δ(t - t_n) = w(t), with i.i.d. random weights {a_n} and random knots {t_n} (Poisson with rate λ).

Anti-derivative operators:
- Shift-invariant solution: D^{-1}φ(t) = (1_+ * φ)(t) = ∫_{-∞}^{t} φ(τ) dτ
- Scale-invariant solution: D_0^{-1}φ(t) = ∫_0^t φ(τ) dτ

Compound Poisson process

Stochastic differential equation: Ds(t) = w(t) with boundary condition s(0) = 0, driven by the innovation w(t) = Σ_n a_n δ(t - t_n).
Formal solution:
s(t) = D^{-1}w(t) = Σ_n a_n D^{-1}{δ(· - t_n)}(t) = Σ_n a_n 1_+(t - t_n)
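The compound Poisson process can be simulated directly from the formal solution s(t) = Σ_n a_n 1_+(t - t_n). A short numpy sketch (the Gaussian choice for the jump amplitudes a_n is purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
T, rate = 10.0, 2.0                  # window length and Poisson rate (lambda)
n = rng.poisson(rate * T)            # number of knots in [0, T]
t_knots = np.sort(rng.uniform(0.0, T, n))
a = rng.standard_normal(n)           # i.i.d. weights a_n (Gaussian, illustrative)

t = np.linspace(0.0, T, 1001)
# s(t) = sum_n a_n * 1_{t >= t_n}: piecewise-constant path with s(0) = 0
s = (a[None, :] * (t[:, None] >= t_knots[None, :])).sum(axis=1)
```

The resulting path is piecewise constant: it takes at most n + 1 distinct values, one per inter-knot interval, which is the sparsity pattern the slide alludes to.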

Lévy processes: all admissible brands of innovations

Generalized innovations: white Lévy noise w with E{w(t)w(t')} = σ_w² δ(t - t'); Ds = w (perfect decoupling!). (Wiener 1923; Paul Lévy circa 1930)

Generalized innovation model

Theoretical framework: Gelfand's theory of generalized stochastic processes. A generic test function φ plays the role of the index variable: X = ⟨φ, w⟩.
1. White noise (innovation) w
2. Whitening operator L
3. Solution of SDE (general operator): s = L^{-1} w, a sparse stochastic process
4. Approximate decoupling: Y = ⟨φ, s⟩ = ⟨φ, L^{-1}w⟩ = ⟨L^{-1*}φ, w⟩

Proper definition of continuous-domain white noise (Unser et al., IEEE-IT 2014). Regularization operator vs. wavelet analysis. Main feature: inherent sparsity (few significant coefficients).

Infinite divisibility and Lévy exponents

Definition: a random variable X with generic pdf p_id(x) is infinitely divisible (id) iff, for any N ∈ Z+, there exist i.i.d. random variables X_1, ..., X_N such that X =(d) X_1 + ... + X_N.

Rectangular test function: X = ⟨w, rect⟩ decomposes into a sum of i.i.d. contributions over N subintervals.

Proposition: the random variable X = ⟨w, rect⟩, where w is a generalized innovation process, is infinitely divisible. It is uniquely characterized by its Lévy exponent f(ω) = log p̂_id(ω), where
p̂_id(ω) = e^{f(ω)} = ∫_R p_id(x) e^{jωx} dx.

Bottom line: there is a one-to-one correspondence between Lévy exponents and infinitely divisible distributions and, by extension, innovation processes.

Probability laws of innovations are infinitely divisible

X = ⟨w, φ⟩ = lim_n (sum of n i.i.d. contributions)

Statistical description of white Lévy noise w (innovation): characterized by a canonical (p-admissible) Lévy exponent f(ω). Generic observation: X = ⟨φ, w⟩ with φ ∈ L_p(R^d) is infinitely divisible with (modified) Lévy exponent
f_φ(ω) = log p̂_X(ω) = ∫_{R^d} f(ω φ(x)) dx
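The correspondence p̂_id(ω) = e^{f(ω)} can be checked numerically; for instance, the Laplace law with λ = 1 has Lévy exponent f(ω) = -log(1 + ω²). A quick sketch, using a plain Riemann sum for the Fourier integral:

```python
import numpy as np

# Laplace pdf with lambda = 1; its Levy exponent is f(w) = -log(1 + w^2)
x = np.linspace(-40.0, 40.0, 8001)
dx = x[1] - x[0]
p_id = 0.5 * np.exp(-np.abs(x))

def char_fun(w):
    # p_hat(w) = int p_id(x) e^{jwx} dx  (Riemann sum; the tails are negligible)
    return (p_id * np.exp(1j * w * x)).sum() * dx

for w in (0.0, 0.7, 2.0):
    assert abs(char_fun(w).real - np.exp(-np.log1p(w**2))) < 1e-3
    assert abs(char_fun(w).imag) < 1e-12
```

The imaginary part vanishes because the law is symmetric; the real part matches e^{f(ω)} = 1/(1 + ω²) to the accuracy of the quadrature.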

Probability laws of sparse processes are id

Analysis: go back to the innovation process, w = Ls.
Generic random observation: X = ⟨φ, w⟩ with φ ∈ S(R^d) or φ ∈ L_p(R^d) (by extension).
Linear functional: Y = ⟨ψ, s⟩ = ⟨ψ, L^{-1}w⟩ = ⟨L^{-1*}ψ, w⟩.
If φ = L^{-1*}ψ ∈ L_p(R^d), then Y = ⟨ψ, s⟩ = ⟨φ, w⟩ is infinitely divisible with Lévy exponent f_φ(ω) = ∫_{R^d} f(ω φ(x)) dx, which gives an explicit form of the pdf:
p_Y(y) = F^{-1}{e^{f_φ(ω)}}(y) = (1/2π) ∫_R e^{f_φ(ω) - jωy} dω
(Unser and Tafti, An Introduction to Sparse Stochastic Processes)

Examples of infinitely divisible laws p_id(x) (from least to most sparse):
(a) Gaussian: p_Gauss(x) = (1/(√(2π) σ)) e^{-x²/(2σ²)}
(b) Laplace: p_Laplace(x) = (λ/2) e^{-λ|x|}
(c) Compound Poisson: p_Poisson(x) = F^{-1}{e^{λ(p̂_A(ω) - 1)}}(x)
(d) Cauchy (stable): p_Cauchy(x) = 1/(π(x² + 1))
Characteristic function: p̂_id(ω) = ∫_R p_id(x) e^{jωx} dx = e^{f(ω)}

Examples of id noise distributions p_id(x) (increasingly sparse)

Observations: X_n = ⟨w, rect(· - n)⟩
(a) Gaussian: f(ω) = -(σ_0²/2) ω²
(b) Laplace: f(ω) = -log(1 + ω²)
(c) Compound Poisson: f(ω) = λ ∫_R (e^{jxω} - 1) p(x) dx
(d) Cauchy (stable): f(ω) = -s_0 |ω|

Complete mathematical characterization: P̂_w(φ) = exp(∫_{R^d} f(φ(x)) dx)

Generation of self-similar processes: s = L^{-1} w, with L^{-1} a fractional integrator of frequency response 1/(jω)^{H + 1/2}.
Realizations for H = 0.50, 0.75, 1.25, 1.50: Gaussian excitation yields fractional Brownian motion (Mandelbrot, 1968); sparse (generalized Poisson) excitation yields piecewise-smooth counterparts. (Unser-Tafti, IEEE-SP 2010)
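The qualitative difference between the Gaussian and the stable (Cauchy) members of the family is easy to see by integrating discrete innovations; a rough sketch in which a cumulative sum stands in for D^{-1}:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 10000
w_gauss = rng.standard_normal(N)     # Gaussian innovation: f(w) = -(1/2) w^2
w_cauchy = rng.standard_cauchy(N)    # Cauchy innovation:  f(w) = -|w|

s_gauss = np.cumsum(w_gauss)         # Brownian-motion-like path
s_cauchy = np.cumsum(w_cauchy)       # sparse path: a few huge jumps dominate

# Heavy tails: the single largest increment carries a macroscopic fraction of
# the total activity for Cauchy noise, but a vanishing fraction for Gaussian.
r_gauss = np.abs(w_gauss).max() / np.abs(w_gauss).sum()
r_cauchy = np.abs(w_cauchy).max() / np.abs(w_cauchy).sum()
```

Plotting s_gauss against s_cauchy reproduces the visual contrast on the slide: diffuse roughness for the Gaussian case, isolated jumps for the sparse one.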

Aesthetic sparse signal: the Mondrian process

L = D_x D_y, frequency response (jω_x)(jω_y); λ = 30

Scale- and rotation-invariant processes

Stochastic partial differential equation: (-Δ)^{(H+1)/2} s(x) = w(x)
Realizations for H = 0.5, 0.75, 1.25, 1.75 with Gaussian vs. sparse (generalized Poisson) excitation. (Unser-Tafti, IEEE-SP 2010)

Powers of ten: from astronomy to biology

RECONSTRUCTION OF BIOMEDICAL IMAGES
- Discretization of reconstruction problem
- Signal reconstruction algorithm (MAP)
- Examples of image reconstruction: deconvolution microscopy, X-ray tomography, cryo-electron tomography, phase-contrast tomography

Discretization of reconstruction problem

Spline-like reconstruction model: s(r) = Σ_{k∈Ω} s[k] β_k(r), with s = (s[k])_{k∈Ω}.
Innovation model: Ls = w, i.e., s = L^{-1}w; discretization u = Ls (matrix notation); p_U is part of the infinitely divisible family.
Physical model (image formation and acquisition):
y_m = ∫_{R^d} s(x) η_m(x) dx + n[m] = ⟨s, η_m⟩ + n[m], m = 1, ..., M
y = y_0 + n = Hs + n, with n i.i.d. noise with pdf p_N and (M × K) system matrix [H]_{m,k} = ⟨η_m, β_k⟩ = ∫_{R^d} η_m(r) β_k(r) dr.

Posterior probability distribution

p_{S|Y}(s|y) = p_{Y|S}(y|s) p_S(s) / p_Y(y) = p_N(y - Hs) p_S(s) / p_Y(y)  (Bayes' rule)
With u = Ls: p_S(s) ∝ p_U(Ls) = Π_{k∈Ω} p_U([Ls]_k).
Additive white Gaussian noise (AWGN) scenario:
p_{S|Y}(s|y) ∝ exp(-||y - Hs||²₂/(2σ²)) Π_{k∈Ω} p_U([Ls]_k)
... and then take the log and maximize ...

General form of MAP estimator

s_MAP = arg min_s (1/2)||y - Hs||²₂ + σ² Σ_n Φ_U([Ls]_n)
with potential Φ_U(x) = -log p_U(x):
- Gaussian: p_U(x) = (1/(√(2π) σ_0)) e^{-x²/(2σ_0²)}, so Φ_U(x) = x²/(2σ_0²) + C_1: ℓ2 minimization, linear attenuation
- Laplace: p_U(x) = (λ/2) e^{-λ|x|}, so Φ_U(x) = λ|x| + C_2: ℓ1 minimization, soft-threshold
- Student: p_U(x) = (1/B(r, 1/2)) (1/(x² + 1))^{r + 1/2}, so Φ_U(x) = (r + 1/2) log(1 + x²) + C_3: ℓp relaxation for p → 0, shrinkage function

Proximal operator: pointwise denoiser.
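For the first two potentials the proximal operator (the pointwise denoiser) has a closed form; a small sketch, with function names and signatures of our own choosing:

```python
import numpy as np

def prox_gaussian(y, lam, sigma0=1.0):
    """Prox of lam * x^2 / (2 sigma0^2): linear attenuation of y."""
    return y / (1.0 + lam / sigma0**2)

def prox_laplace(y, lam):
    """Prox of lam * |x|: the soft-threshold (shrinkage) function."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

y = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
small = prox_laplace(y, 1.0)   # inputs below the threshold are set to zero
```

The Gaussian prox merely scales its input (hence a linear algorithm), while the Laplace prox zeroes out small values, which is the mechanism behind sparsity-promoting reconstruction.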

Maximum a posteriori (MAP) estimation

Constrained optimization formulation with auxiliary innovation variable u = Ls; augmented Lagrangian:
L_A(s, u, α) = (1/2)||y - Hs||²₂ + σ² Σ_n Φ_U([u]_n) + α^T(Ls - u) + (μ/2)||Ls - u||²₂

Alternating direction method of multipliers (ADMM): sequential minimization
s^{k+1} ← arg min_s L_A(s, u^k, α^k)
α^{k+1} = α^k + μ(Ls^{k+1} - u^{k+1})

Linear inverse problem: s^{k+1} = (H^T H + μ L^T L)^{-1}(H^T y + z^k) with z^k = L^T(μ u^k - α^k)
Nonlinear denoising: u^{k+1} = prox_{Φ_U}(Ls^{k+1} + α^k/μ; σ²/μ)

Proximal operator tailored to the stochastic model:
prox_{Φ_U}(y; λ) = arg min_u (1/2)(y - u)² + λ Φ_U(u)

Deconvolution of fluorescence micrographs

Physical model of a diffraction-limited microscope: g(x, y, z) = (h_3D * s)(x, y, z)
3-D point spread function (PSF):
h_3D(x, y, z) = I_0 |p_λ(x/M, y/M, z/M²)|²
p_λ(x, y, z) = ∫_{R²} P(ω_1, ω_2) exp(j2πz(ω_1² + ω_2²)/(2λf_0²)) exp(-j2π(xω_1 + yω_2)/(λf_0)) dω_1 dω_2

Optical parameters: λ: wavelength (emission); M: magnification factor; f_0: focal length; P(ω_1, ω_2) = 1_{||ω|| < R_0}: pupil function; NA = n sin θ = R_0/f_0: numerical aperture.

Deconvolution: numerical set-up

Discretization (ω_0 ≤ π) and representation in the (separable) sinc basis {sinc(x - k)}_{k∈Z^d}.
Analysis functions: η_m(x, y, z) = h_3D(x - m_1, y - m_2, z - m_3)
[H]_{m,k} = ⟨η_m, sinc(· - k)⟩ = ⟨h_3D(· - m), sinc(· - k)⟩ = (sinc * h_3D)(m - k) = h_3D(m - k)
H and L: convolution matrices diagonalized by the discrete Fourier transform. The linear step of the ADMM algorithm is implemented with the FFT:
s^{k+1} = (H^T H + μ L^T L)^{-1}(H^T y + z^k) with z^k = L^T(μ u^k - α^k)
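Because circulant H and L are diagonalized by the DFT, the linear ADMM step reduces to a pointwise division in the Fourier domain. A 1-D sketch (kernel, μ, and the stand-in vector z are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
N = 256
h = np.zeros(N); h[:5] = 0.2                 # blur kernel (circular convolution)
l = np.zeros(N); l[0], l[1] = -1.0, 1.0      # first-difference filter
H_f, L_f = np.fft.fft(h), np.fft.fft(l)

s_true = rng.standard_normal(N)
y = np.real(np.fft.ifft(H_f * np.fft.fft(s_true)))   # y = H s (noise-free here)

mu = 0.01
z = rng.standard_normal(N)   # stands in for L^T(mu u - alpha) at one iteration
# s = (H^T H + mu L^T L)^{-1} (H^T y + z): pointwise division in the DFT domain
num = np.conj(H_f) * np.fft.fft(y) + np.fft.fft(z)
den = np.abs(H_f)**2 + mu * np.abs(L_f)**2
s = np.real(np.fft.ifft(num / den))
```

This costs O(N log N) per iteration instead of a dense O(N³) solve, which is what makes large-scale deconvolution practical.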

2D deconvolution experiment

Test images: astrocyte cells, bovine pulmonary artery cells, human embryonic stem cells; L: gradient. Deconvolution results in dB (optimized parameters) compared for the Gaussian, Laplace, and Student estimators on the three images. [table values missing]

3D deconvolution with sparsity constraints

Maximum-intensity projections of image stacks; Leica DM 5500 widefield epifluorescence microscope with a 63x oil-immersion objective; C. elegans embryo labeled with Hoechst, Alexa488, Alexa568. (Vonesch-Unser, IEEE Trans. Im. Proc. 2009)

Computed tomography (straight rays)

Projection geometry: x = tθ + rθ^⊥ with θ = (cos θ, sin θ).
Radon transform (line integrals):
R_θ{s}(t) = ∫_R s(tθ + rθ^⊥) dr = ∫_{R²} s(x) δ(t - ⟨x, θ⟩) dx
The collection of projections R_θ{s}(t) over θ forms the sinogram.
Equivalent analysis functions: η_m(x) = δ(t_m - ⟨x, θ_m⟩)

Computed tomography reconstruction results

Figure: images used in the X-ray tomographic reconstruction experiments: (a) the Shepp-Logan (SL) phantom; (b) cross section of a human lung.
Table: reconstruction results of X-ray computed tomography using different estimators; estimation performance (SNR in dB) for the Gaussian, Laplace, and Student estimators over different numbers of directions, for the SL phantom and the lung; L: discrete gradient. [table values missing]
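The line-integral definition can be checked with a toy discrete Radon projector (nearest-neighbour sampling on a rotated grid; purely illustrative, not a production Radon transform):

```python
import numpy as np

N = 128
c0 = N // 2
xs = np.arange(N) - c0
X, Y = np.meshgrid(xs, xs)
phantom = ((X**2 + Y**2) < (N // 4)**2).astype(float)   # centered disk

def radon_projection(img, theta):
    """R_theta{s}(t): sample s on a grid rotated by theta, integrate along r."""
    c, s = np.cos(theta), np.sin(theta)
    Xr = np.clip(np.round(c * X - s * Y).astype(int) + c0, 0, N - 1)
    Yr = np.clip(np.round(s * X + c * Y).astype(int) + c0, 0, N - 1)
    return img[Yr, Xr].sum(axis=0)   # one line integral per offset t

p0 = radon_projection(phantom, 0.0)
p45 = radon_projection(phantom, np.pi / 4)
```

A centered disk is rotation invariant, so its projections should be (nearly) the same at every angle, and the peak of the profile should be close to the disk diameter.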

EM: single-particle analysis

Cryo-electron micrograph: bovine papillomavirus (courtesy of C.-O. Sorzano).

Image reconstruction (real data): standard Fourier-based reconstruction vs. high-resolution Fourier-based reconstruction vs. high-resolution reconstruction with sparsity.

Differential phase-contrast tomography

Grating interferometer at the Paul Scherrer Institute (PSI), Villigen: the phase gradient along x is encoded in the intensity of the interference pattern recorded on the CCD (Pfeiffer, Nature 2006).
Mathematical model: g(t, θ) = (∂/∂t) R_θ{f}(t); in matrix form, g = Hs with [H]_{(i,j),k} = (∂/∂t) P_{θ_j}{β_k}(t_i).

Experimental results

Rat brain reconstruction with 721 projections: ADMM-PCG (TV) vs. FBP (Pfeiffer et al., Nucl. Inst. Meth. Phys. Res. 2007); L: discrete gradient, Φ(u) = λ||u||_1. Collaboration: Prof. Marco Stampanoni, TOMCAT, PSI/ETHZ.

Reducing the number of views

Rat brain reconstruction from a strongly reduced set of projections: ADMM-PCG vs. g-FBP; SSIM = .96, .5, .95, .60, .49, .5, .89, .43. Collaboration: Prof. Marco Stampanoni, TOMCAT, PSI/ETHZ. (Nilchian et al., Optics Express 2013)

Performance evaluation

Gold standard: high-quality iterative reconstruction with 721 views. Reduction of acquisition time by a factor of 10 (or more)?

CAN WE GO BEYOND MAP ESTIMATION?

A detailed investigation of the simpler denoising problem; test case: Lévy processes.
- Gaussian: MAP = Wiener = MMSE
- Poisson: MAP = zero!
1. Can we compute the best (= MMSE) estimator? Yes, by using belief propagation (Kamilov et al., IEEE-SP 2013).
2. Can we compute it with an iterative MAP-type algorithm? Yes (with the help of Haar wavelets), by optimizing the thresholding function.

Pointwise MMSE estimators for AWGN

Minimum-mean-square-error (MMSE) estimator from y = x + n:
x_MMSE(y) = E{X | Y = y} = ∫_R x p_{X|Y}(x|y) dx
AWGN probability model: p_{Y|X}(y|x) = g_σ(y - x) and p_Y = g_σ * p_X, where g_σ is the Gaussian pdf (zero mean, variance σ²).
Stein's formula for AWGN:
x_MMSE(y) = y + σ² (d/dy) log p_Y(y) = y + σ² p_Y'(y)/p_Y(y)
Comparison for a Laplacian prior: MMSE vs. LMMSE vs. MAP (= soft-threshold).
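Stein's formula can be verified numerically, e.g., for a Laplace prior in AWGN (Riemann-sum integration; the values of σ, λ, and the test point are arbitrary):

```python
import numpy as np

sigma, lam = 0.5, 1.0
x = np.linspace(-25.0, 25.0, 20001)
dx = x[1] - x[0]
p_x = 0.5 * lam * np.exp(-lam * np.abs(x))                 # Laplace prior p_X

def g_sigma(u):
    return np.exp(-u**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

def mmse(y):
    """x_MMSE(y) = E{X | Y = y} for Y = X + N, N ~ N(0, sigma^2)."""
    w = p_x * g_sigma(y - x)
    return (x * w).sum() / w.sum()

def p_y(y):
    return (p_x * g_sigma(y - x)).sum() * dx               # p_Y = g_sigma * p_X

y0, h = 1.3, 1e-4
dlogp = (np.log(p_y(y0 + h)) - np.log(p_y(y0 - h))) / (2 * h)
stein = y0 + sigma**2 * dlogp                              # Stein/Tweedie formula
```

The direct posterior-mean integral and the Stein form agree, and both shrink the observation toward zero, as the MMSE-vs-MAP comparison on the slide illustrates.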

Iterative wavelet-based denoising: MAP to MMSE

Consistent Cycle Spinning (CCS) (Kamilov, IEEE-SPL 2012). CCS denoising solves
min_s (1/2)||y - s||²₂ + (τ/M) Φ(As), where A is a tight-frame matrix.

Algorithm (CCS denoising):
input: y, s^0 ∈ R^N, τ, μ ∈ R_+
set: k = 0, λ^0 = 0, u = Ay
repeat:
  z^{k+1} = prox_Φ((u + μAs^k + λ^k)/(1 + μ); τ/(1 + μ))
    [or z^{k+1} = v_MMSE(z^{k+1}; σ/√(1 + μ)) for the MMSE variant]
  s^{k+1} = A^T(z^{k+1} - λ^k/μ)
  λ^{k+1} = λ^k - μ(z^{k+1} - As^{k+1})
  k = k + 1
until stopping criterion
return s* = s^k

CCS constraint: z = As with ||z||² = M||s||² (enforces energy conservation).
Variation on a theme: substitute MAP shrinkage by MMSE shrinkage (Kazerouni, IEEE-SPL 2013): iterative MMSE denoising.

Comparison of wavelet denoising strategies

Key empirical finding: CCS MMSE denoising yields the optimal solution! (Unser and Tafti, An Introduction to Sparse Stochastic Processes)
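The flavour of cycle spinning can be conveyed with a single-level Haar transform and soft-threshold shrinkage averaged over the two circular shifts (a toy version of ours; CCS proper enforces the consistency constraint iteratively, and the test signal and threshold are illustrative):

```python
import numpy as np

def haar(x):                       # single-level orthonormal Haar analysis
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def ihaar(a, d):                   # exact inverse (synthesis)
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def denoise_cycle_spin(y, t):
    """Average a Haar soft-threshold denoiser over both circular shifts."""
    out = np.zeros_like(y)
    for k in (0, 1):
        a, d = haar(np.roll(y, k))
        out += np.roll(ihaar(a, soft(d, t)), -k)
    return out / 2

rng = np.random.default_rng(5)
N = 256
s_true = np.zeros(N); s_true[60:150] = 1.0; s_true[200:230] = -0.5
y = s_true + 0.1 * rng.standard_normal(N)
s_hat = denoise_cycle_spin(y, 0.25)
```

Averaging over shifts removes the blocking artifacts of a single Haar basis; replacing `soft` by an MMSE shrinkage function gives the "variation on a theme" mentioned above.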

CONCLUSION

Unifying continuous-domain stochastic model
- Backward compatibility with the classical Gaussian theory
- Operator-based formulation: Lévy-driven SDEs or SPDEs
- Gaussian vs. sparse (generalized Poisson, Student, SαS)

Regularization
- Sparsification via operator-like behavior (whitening)
- Specific family of id potential functions (typically non-convex)

Conceptual framework for sparse signal recovery
- Principled approach for the development of novel algorithms
- Challenge: algorithms for solving large-scale problems in imaging: cryo-electron tomography, diffraction tomography, dynamic MRI (3D + time), etc.
- Beyond MAP reconstruction: MMSE with learning (self-tuning)

Conclusion (cont'd)

The continuous-domain theory of sparse stochastic processes is compatible with both 20th-century SP (linear and Fourier-based algorithms) and 21st-century SP (non-linear, sparsity-promoting, wavelet-based algorithms) ... but there are still many open questions ...

References

Theory of sparse stochastic processes
- M. Unser and P. Tafti, An Introduction to Sparse Stochastic Processes, Cambridge University Press, 2014.
- M. Unser and P.D. Tafti, "Stochastic models for sparse and piecewise-smooth signals," IEEE Trans. Signal Processing, vol. 59, no. 3, March 2011.
- M. Unser, P. Tafti, and Q. Sun, "A unified formulation of Gaussian vs. sparse stochastic processes, Part I: Continuous-domain theory," IEEE Trans. Information Theory, vol. 60, no. 3, March 2014.

Algorithms and imaging applications
- E. Bostan, U.S. Kamilov, M. Nilchian, and M. Unser, "Sparse stochastic processes and discretization of linear inverse problems," IEEE Trans. Image Processing, vol. 22, no. 7, 2013.
- C. Vonesch and M. Unser, "A fast multilevel algorithm for wavelet-regularized image restoration," IEEE Trans. Image Processing, vol. 18, no. 3, March 2009.
- M. Nilchian, C. Vonesch, S. Lefkimmiatis, P. Modregger, M. Stampanoni, and M. Unser, "Constrained regularized reconstruction of X-ray-DPCI tomograms with weighted-norm," Optics Express, vol. 21, no. 26, 2013.
- A. Entezari, M. Nilchian, and M. Unser, "A box spline calculus for the discretization of computed tomography reconstruction problems," IEEE Trans. Medical Imaging, vol. 31, no. 8, 2012.

Acknowledgments

Many thanks to Dr. Pouya Tafti, Prof. Qiyu Sun, Prof. Arash Amini, Dr. Cédric Vonesch, Dr. Matthieu Guerquin-Kern, Emrah Bostan, Ulugbek Kamilov, Masih Nilchian, and the members of EPFL's Biomedical Imaging Group. Preprints and demos:

ebook: web preprint. An Introduction to Sparse Stochastic Processes, Michael Unser and Pouya D. Tafti. Cover design: Annette Unser


More information

State Space Representation of Gaussian Processes

State Space Representation of Gaussian Processes State Space Representation of Gaussian Processes Simo Särkkä Department of Biomedical Engineering and Computational Science (BECS) Aalto University, Espoo, Finland June 12th, 2013 Simo Särkkä (Aalto University)

More information

Beyond incoherence and beyond sparsity: compressed sensing in the real world

Beyond incoherence and beyond sparsity: compressed sensing in the real world Beyond incoherence and beyond sparsity: compressed sensing in the real world Clarice Poon 1st November 2013 University of Cambridge, UK Applied Functional and Harmonic Analysis Group Head of Group Anders

More information

( nonlinear constraints)

( nonlinear constraints) Wavelet Design & Applications Basic requirements: Admissibility (single constraint) Orthogonality ( nonlinear constraints) Sparse Representation Smooth functions well approx. by Fourier High-frequency

More information

Superiorized Inversion of the Radon Transform

Superiorized Inversion of the Radon Transform Superiorized Inversion of the Radon Transform Gabor T. Herman Graduate Center, City University of New York March 28, 2017 The Radon Transform in 2D For a function f of two real variables, a real number

More information

Source Reconstruction for 3D Bioluminescence Tomography with Sparse regularization

Source Reconstruction for 3D Bioluminescence Tomography with Sparse regularization 1/33 Source Reconstruction for 3D Bioluminescence Tomography with Sparse regularization Xiaoqun Zhang xqzhang@sjtu.edu.cn Department of Mathematics/Institute of Natural Sciences, Shanghai Jiao Tong University

More information

Sparse linear models

Sparse linear models Sparse linear models Optimization-Based Data Analysis http://www.cims.nyu.edu/~cfgranda/pages/obda_spring16 Carlos Fernandez-Granda 2/22/2016 Introduction Linear transforms Frequency representation Short-time

More information

Introduction to Biomedical Engineering

Introduction to Biomedical Engineering Introduction to Biomedical Engineering Biosignal processing Kung-Bin Sung 6/11/2007 1 Outline Chapter 10: Biosignal processing Characteristics of biosignals Frequency domain representation and analysis

More information

Recovery of Sparse Signals from Noisy Measurements Using an l p -Regularized Least-Squares Algorithm

Recovery of Sparse Signals from Noisy Measurements Using an l p -Regularized Least-Squares Algorithm Recovery of Sparse Signals from Noisy Measurements Using an l p -Regularized Least-Squares Algorithm J. K. Pant, W.-S. Lu, and A. Antoniou University of Victoria August 25, 2011 Compressive Sensing 1 University

More information

Fast Angular Synchronization for Phase Retrieval via Incomplete Information

Fast Angular Synchronization for Phase Retrieval via Incomplete Information Fast Angular Synchronization for Phase Retrieval via Incomplete Information Aditya Viswanathan a and Mark Iwen b a Department of Mathematics, Michigan State University; b Department of Mathematics & Department

More information

Pattern Recognition and Machine Learning

Pattern Recognition and Machine Learning Christopher M. Bishop Pattern Recognition and Machine Learning ÖSpri inger Contents Preface Mathematical notation Contents vii xi xiii 1 Introduction 1 1.1 Example: Polynomial Curve Fitting 4 1.2 Probability

More information

IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 66, NO. 4, FEBRUARY 15,

IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 66, NO. 4, FEBRUARY 15, IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 66, NO. 4, FEBRUARY 15, 018 1093 Learning Conve Regularizers for Optimal Bayesian Denoising Ha Q. Nguyen, Emrah Bostan, Member, IEEE, and Michael Unser, Fellow,

More information

Deep Learning: Approximation of Functions by Composition

Deep Learning: Approximation of Functions by Composition Deep Learning: Approximation of Functions by Composition Zuowei Shen Department of Mathematics National University of Singapore Outline 1 A brief introduction of approximation theory 2 Deep learning: approximation

More information

Minimizing the Difference of L 1 and L 2 Norms with Applications

Minimizing the Difference of L 1 and L 2 Norms with Applications 1/36 Minimizing the Difference of L 1 and L 2 Norms with Department of Mathematical Sciences University of Texas Dallas May 31, 2017 Partially supported by NSF DMS 1522786 2/36 Outline 1 A nonconvex approach:

More information

Joint Bayesian Compressed Sensing with Prior Estimate

Joint Bayesian Compressed Sensing with Prior Estimate Joint Bayesian Compressed Sensing with Prior Estimate Berkin Bilgic 1, Tobias Kober 2,3, Gunnar Krueger 2,3, Elfar Adalsteinsson 1,4 1 Electrical Engineering and Computer Science, MIT 2 Laboratory for

More information

SEAGLE: Robust Computational Imaging under Multiple Scattering

SEAGLE: Robust Computational Imaging under Multiple Scattering MITSUBISHI ELECTRIC RESEARCH LABORATORIES http://www.merl.com SEAGLE: Robust Computational Imaging under Multiple Scattering Liu, H.-Y.; Liu, D.; Mansour, H.; Boufounos, P.T.; Waller, L.; Kamilov, U. TR217-81

More information

Compressive Sensing and Beyond

Compressive Sensing and Beyond Compressive Sensing and Beyond Sohail Bahmani Gerorgia Tech. Signal Processing Compressed Sensing Signal Models Classics: bandlimited The Sampling Theorem Any signal with bandwidth B can be recovered

More information

Denoising of NIRS Measured Biomedical Signals

Denoising of NIRS Measured Biomedical Signals Denoising of NIRS Measured Biomedical Signals Y V Rami reddy 1, Dr.D.VishnuVardhan 2 1 M. Tech, Dept of ECE, JNTUA College of Engineering, Pulivendula, A.P, India 2 Assistant Professor, Dept of ECE, JNTUA

More information

Sensing systems limited by constraints: physical size, time, cost, energy

Sensing systems limited by constraints: physical size, time, cost, energy Rebecca Willett Sensing systems limited by constraints: physical size, time, cost, energy Reduce the number of measurements needed for reconstruction Higher accuracy data subject to constraints Original

More information

Hamburger Beiträge zur Angewandten Mathematik

Hamburger Beiträge zur Angewandten Mathematik Hamburger Beiträge zur Angewandten Mathematik Error Estimates for Filtered Back Projection Matthias Beckmann and Armin Iske Nr. 2015-03 January 2015 Error Estimates for Filtered Back Projection Matthias

More information

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS PROBABILITY: LIMIT THEOREMS II, SPRING 15. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please

More information

Approximate Message Passing with Built-in Parameter Estimation for Sparse Signal Recovery

Approximate Message Passing with Built-in Parameter Estimation for Sparse Signal Recovery Approimate Message Passing with Built-in Parameter Estimation for Sparse Signal Recovery arxiv:1606.00901v1 [cs.it] Jun 016 Shuai Huang, Trac D. Tran Department of Electrical and Computer Engineering Johns

More information

Compressibility of Infinite Sequences and its Interplay with Compressed Sensing Recovery

Compressibility of Infinite Sequences and its Interplay with Compressed Sensing Recovery Compressibility of Infinite Sequences and its Interplay with Compressed Sensing Recovery Jorge F. Silva and Eduardo Pavez Department of Electrical Engineering Information and Decision Systems Group Universidad

More information

A New Look at First Order Methods Lifting the Lipschitz Gradient Continuity Restriction

A New Look at First Order Methods Lifting the Lipschitz Gradient Continuity Restriction A New Look at First Order Methods Lifting the Lipschitz Gradient Continuity Restriction Marc Teboulle School of Mathematical Sciences Tel Aviv University Joint work with H. Bauschke and J. Bolte Optimization

More information

Recent developments on sparse representation

Recent developments on sparse representation Recent developments on sparse representation Zeng Tieyong Department of Mathematics, Hong Kong Baptist University Email: zeng@hkbu.edu.hk Hong Kong Baptist University Dec. 8, 2008 First Previous Next Last

More information

Unbiased Risk Estimation as Parameter Choice Rule for Filter-based Regularization Methods

Unbiased Risk Estimation as Parameter Choice Rule for Filter-based Regularization Methods Unbiased Risk Estimation as Parameter Choice Rule for Filter-based Regularization Methods Frank Werner 1 Statistical Inverse Problems in Biophysics Group Max Planck Institute for Biophysical Chemistry,

More information

ECE G: Special Topics in Signal Processing: Sparsity, Structure, and Inference

ECE G: Special Topics in Signal Processing: Sparsity, Structure, and Inference ECE 18-898G: Special Topics in Signal Processing: Sparsity, Structure, and Inference Sparse Recovery using L1 minimization - algorithms Yuejie Chi Department of Electrical and Computer Engineering Spring

More information

SPARSE SIGNAL RESTORATION. 1. Introduction

SPARSE SIGNAL RESTORATION. 1. Introduction SPARSE SIGNAL RESTORATION IVAN W. SELESNICK 1. Introduction These notes describe an approach for the restoration of degraded signals using sparsity. This approach, which has become quite popular, is useful

More information

Cheng Soon Ong & Christian Walder. Canberra February June 2018

Cheng Soon Ong & Christian Walder. Canberra February June 2018 Cheng Soon Ong & Christian Walder Research Group and College of Engineering and Computer Science Canberra February June 2018 (Many figures from C. M. Bishop, "Pattern Recognition and ") 1of 254 Part V

More information

Variational Image Restoration

Variational Image Restoration Variational Image Restoration Yuling Jiao yljiaostatistics@znufe.edu.cn School of and Statistics and Mathematics ZNUFE Dec 30, 2014 Outline 1 1 Classical Variational Restoration Models and Algorithms 1.1

More information

Signal Denoising with Wavelets

Signal Denoising with Wavelets Signal Denoising with Wavelets Selin Aviyente Department of Electrical and Computer Engineering Michigan State University March 30, 2010 Introduction Assume an additive noise model: x[n] = f [n] + w[n]

More information

Compressive Imaging by Generalized Total Variation Minimization

Compressive Imaging by Generalized Total Variation Minimization 1 / 23 Compressive Imaging by Generalized Total Variation Minimization Jie Yan and Wu-Sheng Lu Department of Electrical and Computer Engineering University of Victoria, Victoria, BC, Canada APCCAS 2014,

More information

Index. p, lip, 78 8 function, 107 v, 7-8 w, 7-8 i,7-8 sine, 43 Bo,94-96

Index. p, lip, 78 8 function, 107 v, 7-8 w, 7-8 i,7-8 sine, 43 Bo,94-96 p, lip, 78 8 function, 107 v, 7-8 w, 7-8 i,7-8 sine, 43 Bo,94-96 B 1,94-96 M,94-96 B oro!' 94-96 BIro!' 94-96 I/r, 79 2D linear system, 56 2D FFT, 119 2D Fourier transform, 1, 12, 18,91 2D sinc, 107, 112

More information

Continuous State MRF s

Continuous State MRF s EE64 Digital Image Processing II: Purdue University VISE - December 4, Continuous State MRF s Topics to be covered: Quadratic functions Non-Convex functions Continuous MAP estimation Convex functions EE64

More information

Optimization for Learning and Big Data

Optimization for Learning and Big Data Optimization for Learning and Big Data Donald Goldfarb Department of IEOR Columbia University Department of Mathematics Distinguished Lecture Series May 17-19, 2016. Lecture 1. First-Order Methods for

More information

Restoration of Missing Data in Limited Angle Tomography Based on Helgason- Ludwig Consistency Conditions

Restoration of Missing Data in Limited Angle Tomography Based on Helgason- Ludwig Consistency Conditions Restoration of Missing Data in Limited Angle Tomography Based on Helgason- Ludwig Consistency Conditions Yixing Huang, Oliver Taubmann, Xiaolin Huang, Guenter Lauritsch, Andreas Maier 26.01. 2017 Pattern

More information

Compressed Sensing in Astronomy

Compressed Sensing in Astronomy Compressed Sensing in Astronomy J.-L. Starck CEA, IRFU, Service d'astrophysique, France jstarck@cea.fr http://jstarck.free.fr Collaborators: J. Bobin, CEA. Introduction: Compressed Sensing (CS) Sparse

More information

State-Space Methods for Inferring Spike Trains from Calcium Imaging

State-Space Methods for Inferring Spike Trains from Calcium Imaging State-Space Methods for Inferring Spike Trains from Calcium Imaging Joshua Vogelstein Johns Hopkins April 23, 2009 Joshua Vogelstein (Johns Hopkins) State-Space Calcium Imaging April 23, 2009 1 / 78 Outline

More information

Multichannel Sampling of Signals with Finite. Rate of Innovation

Multichannel Sampling of Signals with Finite. Rate of Innovation Multichannel Sampling of Signals with Finite 1 Rate of Innovation Hojjat Akhondi Asl, Pier Luigi Dragotti and Loic Baboulaz EDICS: DSP-TFSR, DSP-SAMP. Abstract In this paper we present a possible extension

More information

EE 4372 Tomography. Carlos E. Davila, Dept. of Electrical Engineering Southern Methodist University

EE 4372 Tomography. Carlos E. Davila, Dept. of Electrical Engineering Southern Methodist University EE 4372 Tomography Carlos E. Davila, Dept. of Electrical Engineering Southern Methodist University EE 4372, SMU Department of Electrical Engineering 86 Tomography: Background 1-D Fourier Transform: F(

More information

An Introduction to Sparse Approximation

An Introduction to Sparse Approximation An Introduction to Sparse Approximation Anna C. Gilbert Department of Mathematics University of Michigan Basic image/signal/data compression: transform coding Approximate signals sparsely Compress images,

More information

Inexact Alternating Direction Method of Multipliers for Separable Convex Optimization

Inexact Alternating Direction Method of Multipliers for Separable Convex Optimization Inexact Alternating Direction Method of Multipliers for Separable Convex Optimization Hongchao Zhang hozhang@math.lsu.edu Department of Mathematics Center for Computation and Technology Louisiana State

More information

Exam. Matrikelnummer: Points. Question Bonus. Total. Grade. Information Theory and Signal Reconstruction Summer term 2013

Exam. Matrikelnummer: Points. Question Bonus. Total. Grade. Information Theory and Signal Reconstruction Summer term 2013 Exam Name: Matrikelnummer: Question 1 2 3 4 5 Bonus Points Total Grade 1/6 Question 1 You are traveling to the beautiful country of Markovia. Your travel guide tells you that the weather w i in Markovia

More information

Bayesian X-ray Computed Tomography using a Three-level Hierarchical Prior Model

Bayesian X-ray Computed Tomography using a Three-level Hierarchical Prior Model L. Wang, A. Mohammad-Djafari, N. Gac, MaxEnt 16, Ghent, Belgium. 1/26 Bayesian X-ray Computed Tomography using a Three-level Hierarchical Prior Model Li Wang, Ali Mohammad-Djafari, Nicolas Gac Laboratoire

More information

Perhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows.

Perhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows. Chapter 5 Two Random Variables In a practical engineering problem, there is almost always causal relationship between different events. Some relationships are determined by physical laws, e.g., voltage

More information

On the FPA infrared camera transfer function calculation

On the FPA infrared camera transfer function calculation On the FPA infrared camera transfer function calculation (1) CERTES, Université Paris XII Val de Marne, Créteil, France (2) LTM, Université de Bourgogne, Le Creusot, France by S. Datcu 1, L. Ibos 1,Y.

More information

Optimal mass transport as a distance measure between images

Optimal mass transport as a distance measure between images Optimal mass transport as a distance measure between images Axel Ringh 1 1 Department of Mathematics, KTH Royal Institute of Technology, Stockholm, Sweden. 21 st of June 2018 INRIA, Sophia-Antipolis, France

More information

Near Ideal Behavior of a Modified Elastic Net Algorithm in Compressed Sensing

Near Ideal Behavior of a Modified Elastic Net Algorithm in Compressed Sensing Near Ideal Behavior of a Modified Elastic Net Algorithm in Compressed Sensing M. Vidyasagar Cecil & Ida Green Chair The University of Texas at Dallas M.Vidyasagar@utdallas.edu www.utdallas.edu/ m.vidyasagar

More information

Rigorous Dynamics and Consistent Estimation in Arbitrarily Conditioned Linear Systems

Rigorous Dynamics and Consistent Estimation in Arbitrarily Conditioned Linear Systems 1 Rigorous Dynamics and Consistent Estimation in Arbitrarily Conditioned Linear Systems Alyson K. Fletcher, Mojtaba Sahraee-Ardakan, Philip Schniter, and Sundeep Rangan Abstract arxiv:1706.06054v1 cs.it

More information

Independent Component Analysis. Contents

Independent Component Analysis. Contents Contents Preface xvii 1 Introduction 1 1.1 Linear representation of multivariate data 1 1.1.1 The general statistical setting 1 1.1.2 Dimension reduction methods 2 1.1.3 Independence as a guiding principle

More information

Recent Advances in Bayesian Inference for Inverse Problems

Recent Advances in Bayesian Inference for Inverse Problems Recent Advances in Bayesian Inference for Inverse Problems Felix Lucka University College London, UK f.lucka@ucl.ac.uk Applied Inverse Problems Helsinki, May 25, 2015 Bayesian Inference for Inverse Problems

More information