Sparse stochastic processes: A statistical framework for modern signal processing
Sparse stochastic processes: A statistical framework for modern signal processing
Michael Unser, Biomedical Imaging Group, EPFL, Lausanne, Switzerland
Plenary talk, Int. Conf. on Systems, Signals and Image Processing (IWSSIP), Bucharest, July 7-9, 2013

20th century statistical signal processing
Hypothesis: signal = stationary Gaussian process.
- The Karhunen-Loève transform (KLT) is optimal for compression.
- The DCT is asymptotically equivalent to the KLT (Ahmed-Rao, 1974; Unser, 1984) (Pearl et al., IEEE Trans. Commun., 1972).
20th century statistical signal processing (cont'd)
Hypothesis: signal = Gaussian process, measured as y = Hs + n.
- Noise n: i.i.d. Gaussian with variance σ².
- Signal covariance: C_s = E{s s^T}.
- The Wiener filter is optimal for restoration/denoising:
  s_LMMSE = C_s H^T (H C_s H^T + σ² I)^{-1} y = F_Wiener y.
- The Wiener (LMMSE) solution = Gaussian MMSE = Gaussian MAP:
  s_MAP = arg min_s (1/σ²) ||y - Hs||²  +  ||C_s^{-1/2} s||²,
  i.e., a data log-likelihood term plus a Gaussian prior likelihood (quadratic/Tikhonov regularization).

Then came wavelets... and sparsity
From Alfred Haar (1910) to Stéphane Mallat and Ingrid Daubechies (late 1980s); then sparsity and its applications, in particular compressed sensing (Martin Vetterli, David Donoho, Emmanuel Candès, circa 2006).
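As a concrete illustration, the LMMSE formula above can be exercised numerically. The sketch below is a toy example, not from the talk: it denoises a stationary Gaussian AR(1) signal with H = I, and the covariance model, noise level, and sizes are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, rho, sigma = 200, 0.95, 0.5  # hypothetical signal length, AR(1) pole, noise std

# Covariance of a stationary Gaussian AR(1) signal: C_s[i,j] = rho^|i-j| / (1 - rho^2)
idx = np.arange(N)
C_s = rho ** np.abs(idx[:, None] - idx[None, :]) / (1 - rho**2)

# Draw a realization and add white Gaussian noise (model y = H s + n with H = I)
s = np.linalg.cholesky(C_s + 1e-9 * np.eye(N)) @ rng.standard_normal(N)
y = s + sigma * rng.standard_normal(N)

# Wiener/LMMSE estimator: s_hat = C_s H^T (H C_s H^T + sigma^2 I)^{-1} y
F_wiener = C_s @ np.linalg.inv(C_s + sigma**2 * np.eye(N))
s_hat = F_wiener @ y

mse_noisy = np.mean((y - s) ** 2)
mse_wiener = np.mean((s_hat - s) ** 2)
```

For a Gaussian signal this linear estimator is MMSE-optimal; the remainder of the talk is about why it is suboptimal for sparse (non-Gaussian) processes.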
Fact 1: Wavelet thresholding can outperform the Wiener filter
  w_denoised = T_λ(w): soft-thresholding of the wavelet coefficients.
Fact 2: Wavelet coding can outperform JPEG
  f(x) = Σ_{i,k} w_{i,k} ψ_{i,k}(x): wavelet transform, discard the small coefficients, inverse wavelet transform (Shapiro, IEEE-IP 1993).
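Fact 1 can be reproduced in a few lines. The sketch below is a hypothetical minimal example: a one-level orthonormal Haar transform, hand-rolled to stay self-contained, whose highpass coefficients are soft-thresholded. The test signal, noise level, and threshold are illustrative choices.

```python
import numpy as np

def haar_analysis(x):
    # One level of the orthonormal Haar transform (even-length input assumed)
    e, o = x[0::2], x[1::2]
    return (e + o) / np.sqrt(2), (e - o) / np.sqrt(2)  # lowpass, highpass

def haar_synthesis(lo, hi):
    x = np.empty(2 * lo.size)
    x[0::2] = (lo + hi) / np.sqrt(2)
    x[1::2] = (lo - hi) / np.sqrt(2)
    return x

def soft(w, t):
    # Soft-threshold: T_t(w) = sign(w) * max(|w| - t, 0)
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

rng = np.random.default_rng(1)
s = np.repeat(rng.standard_normal(8), 32)   # piecewise-constant signal, length 256
y = s + 0.3 * rng.standard_normal(s.size)   # noisy observation

lo, hi = haar_analysis(y)
denoised = haar_synthesis(lo, soft(hi, 3 * 0.3))  # threshold ~ 3x the noise std

mse_noisy = np.mean((y - s) ** 2)
mse_haar = np.mean((denoised - s) ** 2)
```

Thresholding wins here because the signal is sparse in the wavelet domain; on a genuinely Gaussian stationary signal the Wiener filter would still be optimal.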
Fact 3: l1 schemes can outperform l2 optimization
  s* = argmin_s ||y - Hs||² + λ R(s)   (data consistency + regularization)
- Wavelet-domain regularization: wavelet expansion s = Wv (v typically sparse); sparsity constraint R(s) = ||v||_l1 with v = W^{-1}s; solved by iterated shrinkage-thresholding algorithms (ISTA, FISTA) (Nowak et al.; Daubechies et al., 2004).
- l1 regularization (total variation): R(s) = ||Ls||_l1 with L: gradient (Rudin-Osher, 1992); solved by iterative reweighted least squares (IRLS) or FISTA.

Quest for a unifying framework: the precursor
Gaussian stationary processes as filtered white noise: s(t) = (h * w)(t), with power spectrum Φ_s(ω) ∝ |H(ω)|².
- Frequency response (shaping filter): H(ω) = ∫ h(t) e^{-jωt} dt.
- Whitening operator L recovers the innovation: Ls(t) = w(t), with frequency response L(ω) = 1/H(ω).
- w is Gaussian, stationary, and independent at every point.
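The ISTA iteration referenced above is short enough to sketch. The example below assumes the simplest synthesis model (W = identity, i.e., the signal itself is sparse); the wavelet- or gradient-domain variants only change where the soft-threshold is applied. Matrix sizes, sparsity level, and λ are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
M, N, lam = 60, 128, 0.05  # hypothetical problem size and regularization weight

H = rng.standard_normal((M, N)) / np.sqrt(M)  # random measurement matrix
s_true = np.zeros(N)
s_true[rng.choice(N, 8, replace=False)] = 2.0 + rng.standard_normal(8)
y = H @ s_true + 0.01 * rng.standard_normal(M)

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

# ISTA: gradient step on ||y - Hs||^2 / 2, then the l1 proximal map (soft-threshold)
step = 1.0 / np.linalg.norm(H, 2) ** 2  # 1 / Lipschitz constant of the gradient
s = np.zeros(N)
for _ in range(500):
    s = soft(s - step * (H.T @ (H @ s - y)), lam * step)

rel_err = np.linalg.norm(s - s_true) / np.linalg.norm(s_true)
```

FISTA adds a momentum term to the same two-step iteration to accelerate convergence; the per-iteration cost is unchanged.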
Continuous-domain innovation model
Generalized white noise w(x), x ∈ R^d, is fed to a shaping filter L^{-1} (with appropriate boundary conditions) to produce the stochastic process s(x); L is the corresponding whitening operator, with Ls = w.
Our work: the non-Gaussian part of this story. Main finding: the innovation is necessarily infinitely divisible, hence sparse. Why? ... as explained by our current research, invoking powerful theorems in functional analysis (Bochner-Minlos, Gelfand, Schoenberg and Lévy-Khinchine).

Innovation-based synthesis of splines
Take L = d/dx = D, so that L^{-1} is an integrator.
- Impulse response: L^{-1}{δ(· - x0)}(x) = 1_+(x - x0) (a step, by translation invariance).
- Linearity: L^{-1}{Σ_{k∈Z} a[k] δ(· - x_k)}(x) = s(x) = Σ_{k∈Z} a[k] 1_+(x - x_k), a piecewise-constant (degree-0) spline.
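The synthesis picture above (apply L^{-1} = integrator to a weighted impulse train) is easy to mimic on a grid. The sketch below is a hypothetical discretization: Bernoulli-thinned grid locations stand in for a Poisson point process, and cumulative summation stands in for the integrator.

```python
import numpy as np

rng = np.random.default_rng(3)
dt, T = 0.01, 10.0
t = np.arange(0.0, T, dt)

# Weighted impulse train sum_k a[k] delta(x - x_k): Bernoulli-thinned grid
# locations (a crude surrogate for Poisson arrivals) with Gaussian weights a[k]
mask = rng.random(t.size) < 1.0 * dt          # ~1 impulse per unit length on average
impulses = np.where(mask, rng.standard_normal(t.size), 0.0)

# L^{-1} = integrator: each impulse contributes a unit step 1_+(x - x_k),
# so the output s(x) = sum_k a[k] 1_+(x - x_k) is a piecewise-constant spline
s = np.cumsum(impulses)

# s only changes where an impulse sits: its increments reproduce the innovation
n_jumps = np.count_nonzero(np.diff(s))
n_impulses_after_first = np.count_nonzero(impulses[1:])
```

Replacing the integrator by other inverse operators L^{-1} generates the other spline families discussed later in the talk.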
OUTLINE
- Sparse stochastic processes
  - Generalized innovation model
  - Gelfand's theory of generalized stochastic processes
  - Statistical characterization of sparse stochastic processes
  - Lévy processes and their generalization
  - Fractal processes: Gaussian vs. sparse
- Applications
  - Modeling of signals (audio)
  - Algorithms for sparse signal recovery as MAP estimators
  - Optimal denoising (MMSE)
  - Sparse representations, optimal transforms

Sparse stochastic processes
Road map for the theory of sparse processes
1. Specification of the inverse operator L^{-1}: functional analysis, solution of the SDE Ls = w.
2. Characterization of continuous-domain white noise w: higher mathematics, namely generalized functions (Schwartz), measures on topological vector spaces, Gelfand's theory of generalized stochastic processes, and infinite divisibility (Lévy-Khintchine formula).
3. Characterization of the generalized stochastic process s = L^{-1}w: very easy! (after solving 1 and 2).
4. Multi-scale wavelet analysis: characterization of the transform-domain statistics; easy when ψ_i = L*φ_i.

Generalized innovation process
Difficulty 1: w ≠ w(x); the noise is too rough to have a pointwise interpretation.
Difficulty 2: w is an infinite-dimensional random entity; its pdf can only be specified formally, via a probability measure P_w(E) with E ⊂ S'(R^d).
Axiomatic definition (Gelfand-Vilenkin, 1964): w is a generalized innovation process (or continuous-domain white noise) over S'(R^d) if
1. Observability: X = <w, φ> is a well-defined random variable for any test function φ ∈ S(R^d).
2. Stationarity: X_{x0} = <w, φ(· - x0)> is identically distributed for all x0 ∈ R^d.
3. Independent atoms: X1 = <w, φ1> and X2 = <w, φ2> are independent whenever φ1 and φ2 have non-intersecting supports.
Characteristic functional: P_w(φ) = E{e^{j<w,φ>}} = ∫_{S'} e^{j<s,φ>} P_w(ds).
Defining Gaussian noise: discrete vs. continuous
- Discrete white Gaussian noise: X = (X1, ..., XN) with Xn i.i.d. standardized Gaussian.
  Characteristic function: p_X(ω) = exp(-||ω||²/2) = exp(Σ_{n=1}^N f(ω_n)), with Lévy exponent f(ω) = log p_X(ω) = -ω²/2.
- Continuous-domain white Gaussian noise: an infinite-dimensional entity w with generic observations X_n = <w, φ_n>.
  Characteristic functional: P_w(φ) = exp(-(1/2)||φ||²_{L2}) = exp(∫_{R^d} f(φ(x)) dx), so that
  p_{X_n}(ω) = E{e^{jω<w,φ_n>}} = P_w(ωφ_n) = exp(-(1/2)ω²||φ_n||²_{L2}).

Characterization of the generalized innovation
Theorem: Let w be a generalized stochastic process such that X_id = <w, rect> is well-defined. Then w is a generalized innovation (white noise) over S'(R^d) if and only if its characteristic form is given by
  P_w(φ) = E{e^{j<w,φ>}} = exp( ∫_{R^d} f(φ(x)) dx ),
where f(ω) is a valid Lévy exponent (in fact, the Lévy exponent of X_id). Moreover, the random variables X_φ = <w, φ> are all infinitely divisible, with modified Lévy exponent
  f_φ(ω) = ∫_{R^d} f(ωφ(x)) dx.
(Gelfand-Vilenkin, 1964; Amini-U., under review)
Examples of infinitely divisible (id) noise distributions
Observations X_n = <w, rect(· - n)> with pdf p_id(x):
(a) Gaussian: f(ω) = -σ²ω²/2
(b) Laplace: f(ω) = -log(1 + ω²)   (sparser)
(c) Compound Poisson: f(ω) = λ ∫_R (e^{jaω} - 1) p(a) da
(d) Cauchy (stable): f(ω) = -s0 |ω|
Complete mathematical characterization: P_w(φ) = exp(∫_{R^d} f(φ(x)) dx).

Steps 1 + 3: characterization of the sparse process
Abstract formulation of the innovation model: s = L^{-1}w, whitening operator L, so that <s, φ> = <L^{-1}w, φ> = <w, L^{-1*}φ> and
  P_s(φ) = E{e^{j<s,φ>}} = P_w(L^{-1*}φ) = exp( ∫_{R^d} f(L^{-1*}φ(x)) dx ).
Sufficient condition for existence: L^{-1*} is a continuous operator from S(R^d) to L_p(R^d). (U.-Tafti-Sun, preprint, arXiv 2011)
Example 1: Lévy processes
Ds = w (unstable SDE!), solved as s = D_0^{-1}w, i.e., <s, φ> = <w, D_0^{-1*}φ>, with the integrator
  D_0^{-1}φ(t) = ∫_0^t φ(τ) dτ   (boundary condition s(0) = 0).
White noise (innovation) integrated into a Lévy process:
- Gaussian noise → Brownian motion (Wiener, 1923)
- Impulsive noise → compound Poisson process
- SαS (Cauchy) noise → Lévy flight
(Paul Lévy, circa 1930)

Example 2: Self-similar processes
Whitening operator L with frequency response proportional to (jω)^{H+1/2}, so that L^{-1} is a fractional integrator.
- Gaussian innovation → fractional Brownian motion (Mandelbrot, 1968)
- Sparse (generalized Poisson) innovation → its non-Gaussian counterpart
[Figure: realizations of fBm and of sparse generalized-Poisson processes for increasing Hurst exponent H.]
(U.-Tafti, IEEE-SP 2010)
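The three flavors of Lévy process listed above differ only in the distribution of their i.i.d. increments, so all of them can be simulated with a single cumulative sum. In this hypothetical sketch, a Bernoulli-gated Gaussian stands in for compound-Poisson increments.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 1000  # number of unit-time increments

gauss_inc = rng.standard_normal(N)                              # Gaussian innovation
jump_mask = rng.random(N) < 0.05                                # sparse jump locations
poisson_inc = np.where(jump_mask, rng.standard_normal(N), 0.0)  # impulsive innovation
cauchy_inc = rng.standard_cauchy(N)                             # SaS (Cauchy) innovation

# Integrating (cumulatively summing) the innovations yields the processes
brownian = np.cumsum(gauss_inc)        # Brownian motion
comp_poisson = np.cumsum(poisson_inc)  # compound-Poisson (piecewise-constant) path
levy_flight = np.cumsum(cauchy_inc)    # Cauchy Levy flight

frac_zero = np.mean(poisson_inc == 0.0)  # sparsity of the impulsive innovation
max_jump_ratio = np.max(np.abs(cauchy_inc)) / np.max(np.abs(gauss_inc))
```

The heavy-tailed Cauchy increments produce the occasional huge jumps characteristic of Lévy flights, while the compound-Poisson path is exactly constant between jumps.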
Scale- and rotation-invariant processes
Stochastic partial differential equation driven by the fractional Laplacian: (-Δ)^{γ/2} s(x) = w(x), yielding self-similar random fields whose Hurst exponent H is set by γ.
- Gaussian innovation vs. sparse (generalized Poisson) innovation, for increasing H.
[Figure: Gaussian vs. sparse (generalized Poisson) textures for several values of H.]
(U.-Tafti, IEEE-SP 2010)

A brief panorama of applications
A1. Signal modeling (audio)
Sparse, bandpass processes: L = d^N/dt^N + a_{N-1} d^{N-1}/dt^{N-1} + ... + a_1 d/dt + a_0 Id.
(a) Gaussian innovation vs. (b) alpha-stable innovation (sparser).
Mixed sparse processes: s_mix = s_1 + ... + s_M, with
  P_{s_mix}(φ) = Π_{m=1}^M P_{s_m}(φ) = exp( ∫_R Σ_{m=1}^M f_m(L_m^{-1*}φ(t)) dt ),
e.g., a Gaussian AR component plus a generalized Lévy (SαS) component.

A2. Biomedical imaging: MAP reconstruction
Innovation model of the signal: Ls = w, i.e., s = L^{-1}w.
Signal decoupling: discrete version of the operator, u(x) = L_d s(x); in matrix notation, u = Ls. Generalized B-spline: β_L = L_d L^{-1}.
Statistical characterization:
- X = [u]_n identically distributed (and approximately independent);
- probability density function: p_X(x) = F^{-1}{P_w(ω β_L)}(x);
- potential function: Φ_X(x) = -log p_X(x).
Maximum a posteriori (MAP) estimator for additive white Gaussian noise:
  s* = argmin_s (1/2σ²) ||g - Hs||² + Σ_n Φ_X([Ls]_n).
(Bostan et al., IEEE-IP 2013)
Sparser MAP estimators: special cases
  s* = argmin_s (1/2σ²) ||g - Hs||² + Σ_n Φ_X([Ls]_n)
- Gaussian: p_X(x) ∝ e^{-x²/(2σ0²)}, so Φ_X(x) = x²/(2σ0²) + C (quadratic, Tikhonov).
- Laplace: p_X(x) = (λ/2) e^{-λ|x|}, so Φ_X(x) = λ|x| + C (l1; total variation when L is the gradient).
- Student: p_X(x) ∝ (1 + x²)^{-(r+1/2)}, so Φ_X(x) = (r + 1/2) log(1 + x²) + C (log potential).
[Figure: Student potentials for several values of r at fixed variance.]

MRI: Shepp-Logan phantom
Original SL phantom; Fourier sampling pattern along radial lines; L: gradient; optimized parameters; Laplace prior (TV) vs. Student prior (log).
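The tail behavior of the three potentials is what drives the estimators' different sparsity behavior, and it is easy to tabulate. Below is a small comparison with unit parameters, chosen purely for illustration:

```python
import numpy as np

# Potential functions Phi_X(x) = -log p_X(x), up to additive constants:
def phi_gauss(x, s0=1.0):
    return x**2 / (2 * s0**2)          # quadratic: Tikhonov / l2

def phi_laplace(x, lam=1.0):
    return lam * np.abs(x)             # l1: total variation when L = gradient

def phi_student(x, r=1.0):
    return (r + 0.5) * np.log1p(x**2)  # log potential: heaviest tails

x = np.array([0.1, 1.0, 10.0, 100.0])
pg, pl, ps = phi_gauss(x), phi_laplace(x), phi_student(x)
# For large |x| the penalties order as quadratic > l1 > log: the Student (log)
# prior penalizes large innovations the least, hence favors jump-rich solutions.
```

This ordering is why the Student (log) prior reconstructs sharp edges that TV rounds off and the Gaussian prior smooths away.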
MRI reconstruction
Real T1-weighted brain image and MR angiography image; k-space sampling pattern with radial lines; L: gradient; optimized parameters. The reconstruction results (in dB) compare the Gaussian, Laplace, and Student estimators on both datasets.

2D deconvolution experiment
Astrocyte cells, bovine pulmonary artery cells, and human embryonic stem cells; disk-shaped PSF (7x7); L: gradient; optimized parameters. The deconvolution results (in dB) again compare the Gaussian, Laplace, and Student estimators on the three images.
(Bostan et al., IEEE-IP 2013)
A3. Optimal denoising: MMSE formulation
Measurement model: y = x + n, where n is additive white Gaussian noise and x is a sparse first-order process (e.g., a Lévy flight); u = Lx is the discrete innovation (i.i.d. and infinitely divisible).
Optimal estimator: x_MMSE = E{x | y}.
Posterior factorization (with x_0 = 0):
  p(x | y) = (1/Z) Π_{n=1}^N p_{Y|X}(y_n | x_n) Π_{n=1}^N p_U(x_n - x_{n-1}),
a chain-structured model that is solved with a belief-propagation algorithm.
[Figure: SNR (dB) vs. noise level for a Lévy process with Cauchy increments in AWGN; MMSE outperforms the Log (= MAP), TV, and LMMSE estimators.]
(Kamilov et al., IEEE-SP 2013)

A4. Sparse representations, optimal transforms
Innovation model (SDE): Ls = w, i.e., s = L^{-1}w.
Admissible basis functions: ψ_{i,k} = L*φ_{i,k} with φ_{i,k} ∈ L_p(R^d).
Signal expansion = equivalent white-noise analysis:
  Y = <ψ_{i,k}, s> = <L*φ_{i,k}, L^{-1}w> = <φ_{i,k}, w>, so that p_Y(ω) = P_w(ωφ_{i,k}).
Statistical implications: the transform-domain pdfs are infinitely divisible, and the quality of the decoupling depends upon the support of the wavelet/smoothing kernel φ_{i,k}.
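The gap between MMSE and MAP already shows up in the scalar version of this model. The sketch below is a toy with assumed parameters (unit-rate Laplace prior, σ = 0.5): it computes the posterior mean by brute-force quadrature and compares it with the MAP rule, which for this prior is soft-thresholding.

```python
import numpy as np

sigma = 0.5
x_grid = np.linspace(-10.0, 10.0, 4001)

def mmse(y):
    # Posterior mean E{x | y} for p(x) prop. to exp(-|x|) and n ~ N(0, sigma^2),
    # evaluated by quadrature on a fine grid
    w = np.exp(-np.abs(x_grid) - (y - x_grid) ** 2 / (2 * sigma**2))
    return np.sum(x_grid * w) / np.sum(w)

def map_est(y):
    # MAP for the same model is soft-thresholding with threshold sigma^2
    return np.sign(y) * np.maximum(np.abs(y) - sigma**2, 0.0)

y = 0.2
x_map = map_est(y)   # exactly 0, since |y| <= sigma^2 = 0.25
x_mmse = mmse(y)     # strictly between 0 and y: shrinks, but never hard-zeros
```

This is why the figure above separates the MMSE curve from the MAP/TV curves: the two Bayesian rules coincide only in the Gaussian case.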
Operator-like wavelets for sparse AR(1) processes
Innovation model: Ls = w, s = L^{-1}w with L = (D - αI).
Operator-like wavelet: ψ_i = L*φ_i with φ_i a smoothing kernel (Khalidov-U., 2006).
Wavelet analysis: <s, ψ_i(· - t0)> = <L^{-1}w, L*φ_i(· - t0)> = <w, φ_i(· - t0)>.
For α = 0, the operator-like wavelet ψ_0 = L*φ_0 reduces to the Haar wavelet.

Orthogonal expansion of a SαS Lévy process
[Figure: mutual information between expansion coefficients as a function of α; smaller α: sparser, α = 2: Gaussian.]
(Pad-U., ICASSP 2013, SPARS 2013)
Orthogonal expansion of a SαS AR(1) process
[Figure: mutual information vs. α for the identity, DCT/KLT, Haar wavelet, operator-like wavelet, and optimal (ICA) transforms (smaller α: sparser, α = 2: Gaussian); e = 0.9, M = 6.]
(Pad-U., ICASSP 2013, SPARS 2013)

CONCLUSION
- Unifying continuous-domain innovation model
  - Backward compatibility with the classical Gaussian theory
  - Operator-based formulation: Lévy-driven SDEs
  - Gaussian vs. sparse (generalized Poisson, Student, SαS)
  - Focus on unstable SDEs: non-stationary, self-similar processes
- Wavelet analysis vs. regularization
  - Central role of B-splines (see papers)
  - Sparsification/decoupling via operator-like behavior
- Theoretical framework for sparse signal recovery
  - Analytical determination of the pdf in any transformed domain
  - Predictive power: transform coding/denoising (Facts 1, 2, 3)
  - New statistically-founded sparsity priors
  - Derivation of estimators (MAP vs. MMSE): link with LASSO and l1 methods for sparse signal recovery (compressed sensing)
References
Theory of generalized stochastic processes:
- M. Unser, P. Tafti, and Q. Sun, "A unified formulation of Gaussian vs. sparse stochastic processes, Part I: Continuous-domain theory," preprint.
- M. Unser, P. Tafti, A. Amini, and H. Kirshner, "A unified formulation of Gaussian vs. sparse stochastic processes, Part II: Discrete-domain theory," preprint.
- M. Unser and P.D. Tafti, "Stochastic models for sparse and piecewise-smooth signals," IEEE Trans. Signal Processing, vol. 59, no. 3, pp. 989-1006, March 2011.
- Q. Sun and M. Unser, "Left-inverses of fractional Laplacian and sparse stochastic processes," Advances in Computational Mathematics, vol. 36, no. 3, pp. 399-441, April 2012.
Recent applications:
- U.S. Kamilov, P. Pad, A. Amini, and M. Unser, "MMSE estimation of sparse Lévy processes," IEEE Trans. Signal Processing, vol. 61, no. 1, pp. 137-147, 2013.
- A. Amini, U.S. Kamilov, E. Bostan, and M. Unser, "Bayesian estimation for continuous-time sparse stochastic processes," IEEE Trans. Signal Processing, vol. 61, 2013.
- E. Bostan, U.S. Kamilov, M. Nilchian, and M. Unser, "Sparse stochastic processes and discretization of linear inverse problems," IEEE Trans. Image Processing, in press.
- P. Pad and M. Unser, "On the optimality of operator-like wavelets for sparse AR(1) processes," Proc. IEEE Int. Conf. Acoust. Speech Sig. Proc. (ICASSP 2013), Vancouver, Canada, May 2013, in press.

Acknowledgments
Many thanks to Dr. Pouya Tafti, Prof. Qiyu Sun, Prof. Thierry Blu, Emrah Bostan, Dr. Matthieu Guerquin-Kern, Dr. Arash Amini, Dr. Hagai Kirshner, Pedram Pad, Ulugbek Kamilov, Masih Nilchian, and the members of EPFL's Biomedical Imaging Group.
Preprints and demos:
eBook: "An Introduction to Sparse Stochastic Processes" (M. Unser and P. Tafti, Cambridge University Press), available as a web preprint.
Biomedical Imaging Group (BIG), Laboratoire d'imagerie biomédicale, EPFL LIB, Bât. BM 4.127, CH-1015 Lausanne, Switzerland. E-mail: Michael.Unser@epfl.ch
More informationEstimation of the Optimum Rotational Parameter for the Fractional Fourier Transform Using Domain Decomposition
Estimation of the Optimum Rotational Parameter for the Fractional Fourier Transform Using Domain Decomposition Seema Sud 1 1 The Aerospace Corporation, 4851 Stonecroft Blvd. Chantilly, VA 20151 Abstract
More informationCOMPLEX WAVELET TRANSFORM IN SIGNAL AND IMAGE ANALYSIS
COMPLEX WAVELET TRANSFORM IN SIGNAL AND IMAGE ANALYSIS MUSOKO VICTOR, PROCHÁZKA ALEŠ Institute of Chemical Technology, Department of Computing and Control Engineering Technická 905, 66 8 Prague 6, Cech
More informationInverse problems, Dictionary based Signal Models and Compressed Sensing
Inverse problems, Dictionary based Signal Models and Compressed Sensing Rémi Gribonval METISS project-team (audio signal processing, speech recognition, source separation) INRIA, Rennes, France Ecole d
More informationAn Introduction to Sparse Representations and Compressive Sensing. Part I
An Introduction to Sparse Representations and Compressive Sensing Part I Paulo Gonçalves CPE Lyon - 4ETI - Cours Semi-Optionnel Méthodes Avancées pour le Traitement des Signaux 2014 Objectifs Part I The
More informationMassoud BABAIE-ZADEH. Blind Source Separation (BSS) and Independent Componen Analysis (ICA) p.1/39
Blind Source Separation (BSS) and Independent Componen Analysis (ICA) Massoud BABAIE-ZADEH Blind Source Separation (BSS) and Independent Componen Analysis (ICA) p.1/39 Outline Part I Part II Introduction
More informationImage Denoising using Uniform Curvelet Transform and Complex Gaussian Scale Mixture
EE 5359 Multimedia Processing Project Report Image Denoising using Uniform Curvelet Transform and Complex Gaussian Scale Mixture By An Vo ISTRUCTOR: Dr. K. R. Rao Summer 008 Image Denoising using Uniform
More informationPROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS
PROBABILITY: LIMIT THEOREMS II, SPRING 218. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please
More informationBrownian Motion. 1 Definition Brownian Motion Wiener measure... 3
Brownian Motion Contents 1 Definition 2 1.1 Brownian Motion................................. 2 1.2 Wiener measure.................................. 3 2 Construction 4 2.1 Gaussian process.................................
More informationRough paths methods 1: Introduction
Rough paths methods 1: Introduction Samy Tindel Purdue University University of Aarhus 216 Samy T. (Purdue) Rough Paths 1 Aarhus 216 1 / 16 Outline 1 Motivations for rough paths techniques 2 Summary of
More informationSparse linear models and denoising
Lecture notes 4 February 22, 2016 Sparse linear models and denoising 1 Introduction 1.1 Definition and motivation Finding representations of signals that allow to process them more effectively is a central
More informationMULTI-SCALE IMAGE DENOISING BASED ON GOODNESS OF FIT (GOF) TESTS
MULTI-SCALE IMAGE DENOISING BASED ON GOODNESS OF FIT (GOF) TESTS Naveed ur Rehman 1, Khuram Naveed 1, Shoaib Ehsan 2, Klaus McDonald-Maier 2 1 Department of Electrical Engineering, COMSATS Institute of
More informationCheng Soon Ong & Christian Walder. Canberra February June 2018
Cheng Soon Ong & Christian Walder Research Group and College of Engineering and Computer Science Canberra February June 2018 (Many figures from C. M. Bishop, "Pattern Recognition and ") 1of 254 Part V
More informationFRACTIONAL BROWNIAN MODELS FOR VECTOR FIELD DATA. Pouya Dehghani Tafti, Member, IEEE, and Michael Unser, Fellow, IEEE
FRACTIONAL BROWNIAN MODELS FOR VECTOR FIELD DATA Pouya Dehghani Tafti, Member, IEEE, and Michael Unser, Fellow, IEEE Laboratoire d imagerie biomédicale, Ecole Polytechnique Fédérale de Lausanne (EPFL),
More informationStability and Sensitivity of the Capacity in Continuous Channels. Malcolm Egan
Stability and Sensitivity of the Capacity in Continuous Channels Malcolm Egan Univ. Lyon, INSA Lyon, INRIA 2019 European School of Information Theory April 18, 2019 1 / 40 Capacity of Additive Noise Models
More informationLAN property for sde s with additive fractional noise and continuous time observation
LAN property for sde s with additive fractional noise and continuous time observation Eulalia Nualart (Universitat Pompeu Fabra, Barcelona) joint work with Samy Tindel (Purdue University) Vlad s 6th birthday,
More informationSparsity Regularization
Sparsity Regularization Bangti Jin Course Inverse Problems & Imaging 1 / 41 Outline 1 Motivation: sparsity? 2 Mathematical preliminaries 3 l 1 solvers 2 / 41 problem setup finite-dimensional formulation
More informationLecture 6: Multiple Model Filtering, Particle Filtering and Other Approximations
Lecture 6: Multiple Model Filtering, Particle Filtering and Other Approximations Department of Biomedical Engineering and Computational Science Aalto University April 28, 2010 Contents 1 Multiple Model
More informationTopics in fractional Brownian motion
Topics in fractional Brownian motion Esko Valkeila Spring School, Jena 25.3. 2011 We plan to discuss the following items during these lectures: Fractional Brownian motion and its properties. Topics in
More informationELEMENTS OF PROBABILITY THEORY
ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable
More information( nonlinear constraints)
Wavelet Design & Applications Basic requirements: Admissibility (single constraint) Orthogonality ( nonlinear constraints) Sparse Representation Smooth functions well approx. by Fourier High-frequency
More informationSAMPLING DISCRETE-TIME PIECEWISE BANDLIMITED SIGNALS
SAMPLIG DISCRETE-TIME PIECEWISE BADLIMITED SIGALS Martin Vetterli,, Pina Marziliano and Thierry Blu 3 LCAV, 3 IOA, Ecole Polytechnique Fédérale de Lausanne, Switzerland EECS Dept., University of California
More informationIntroduction to Sparsity in Signal Processing
1 Introduction to Sparsity in Signal Processing Ivan Selesnick Polytechnic Institute of New York University Brooklyn, New York selesi@poly.edu 212 2 Under-determined linear equations Consider a system
More informationInformation-Theoretic Limits of Group Testing: Phase Transitions, Noisy Tests, and Partial Recovery
Information-Theoretic Limits of Group Testing: Phase Transitions, Noisy Tests, and Partial Recovery Jonathan Scarlett jonathan.scarlett@epfl.ch Laboratory for Information and Inference Systems (LIONS)
More informationEnhanced Compressive Sensing and More
Enhanced Compressive Sensing and More Yin Zhang Department of Computational and Applied Mathematics Rice University, Houston, Texas, U.S.A. Nonlinear Approximation Techniques Using L1 Texas A & M University
More informationBayesian Paradigm. Maximum A Posteriori Estimation
Bayesian Paradigm Maximum A Posteriori Estimation Simple acquisition model noise + degradation Constraint minimization or Equivalent formulation Constraint minimization Lagrangian (unconstraint minimization)
More information