Tensor-Based Dictionary Learning for Multidimensional Sparse Recovery. Florian Römer and Giovanni Del Galdo


Transcription:

Tensor-Based Dictionary Learning for Multidimensional Sparse Recovery
Florian Römer and Giovanni Del Galdo
2nd CoSeRa, Bonn, 17-19 Sept. 2013
Ilmenau University of Technology, Institute for Information Technology (InIT)
Page 1

Contact Information
Prof. Giovanni Del Galdo
Head of DVT Research Group, composed of:
- Digital Broadcasting Research Laboratory, Ilmenau University of Technology
- Wireless Distribution Systems / Digital Broadcasting Group, Fraunhofer Institute for Integrated Circuits IIS
office: +49 3677 69-4280, mobile: +49 151 62 91 01 81
e-mail: giovanni.delgaldo@tu-ilmenau.de / giovanni.delgaldo@iis.fraunhofer.de

Dr.-Ing. Florian Römer
Digital Broadcasting Research Laboratory, Ilmenau University of Technology
office: +49 3677 69-4286
e-mail: florian.roemer@tu-ilmenau.de / florian.roemer@iis-extern.fraunhofer.de

Mailing address: Ilmenau University of Technology, Helmholtzplatz 2, 98693 Ilmenau, Germany
Page 2

Page 3: Compressed Sensing and Dictionary Learning

Motivation
- Compressed sensing allows lossless acquisition of signals at a sampling rate below Nyquist.
- Perfect knowledge of the dictionary is usually assumed; in practice, the dictionary may be unknown
  - partially (e.g., calibration, model mismatch, grid offset), or
  - completely (e.g., typical structures in image coding).

State of the art
- Two of the most popular fundamental approaches: MOD [EAH00] and K-SVD [AEB06]
- Many variations exist, e.g., SimCO, RLS-DLA, discriminative K-SVD.

Our contribution
- A tensor formulation for separable 2-D sparse recovery problems
- Extensions of MOD and K-SVD that exploit this structure, yielding improved algorithms

[EAH00] K. Engan, S. O. Aase, and J. H. Husøy, "Multi-frame compression: Theory and design," EURASIP Signal Processing, vol. 80, no. 10, pp. 2121-2140, 2000.
[AEB06] M. Aharon, M. Elad, and A. Bruckstein, "K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation," IEEE Trans. on Signal Processing, vol. 54, no. 11, pp. 4311-4322, 2006.
Page 3

Page 4: Outline
- Motivation
- Data model: 2-D extension
- Dictionary learning algorithms: MOD, T-MOD, K-SVD, K-HOSVD
- Numerical results
- Conclusions

Page 5: Data model — 1-D sparse recovery problem
The noisy 1-D sparse recovery problem reads
    Y = A S + N,
where Y (M x T) contains the observations, A (M x N) is the overcomplete dictionary, S (N x T) is the sparse coefficient matrix, N (M x T) is additive measurement noise, and M < N < T.

2-D sparse recovery problem: if the manifold is separable (e.g., 2-D DOA estimation with a uniform rectangular array, or joint DOA/DOD estimation) and a separable 2-D sampling grid is chosen, then the dictionary has a Kronecker structure:
    A = A_1 kron A_2, with A_1 (M_1 x N_1), A_2 (M_2 x N_2), M = M_1 M_2, N = N_1 N_2.

Page 6: Multilinear structure
The 2-D model possesses an interesting multilinear structure: folding Y (M x T) into a tensor of size M_1 x M_2 x T and S (N x T) into a sparse core tensor of size N_1 x N_2 x T, the model becomes a Tucker-2 decomposition with a sparse core:
    Y_tensor = S_tensor x_1 A_1 x_2 A_2 + N_tensor.
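The equivalence between the flat Kronecker model and the Tucker-2 tensor form can be verified numerically. A minimal NumPy sketch; all dimensions are hypothetical toy sizes chosen only for illustration, not values from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small dimensions for illustration.
M1, N1, M2, N2, T = 4, 6, 3, 5, 10

A1 = rng.standard_normal((M1, N1))      # first factor dictionary
A2 = rng.standard_normal((M2, N2))      # second factor dictionary
S = rng.standard_normal((N1 * N2, T))   # coefficient matrix (dense here, just for the check)

# Flat 1-D form of the 2-D model: Y = (A1 kron A2) S
Y_flat = np.kron(A1, A2) @ S

# Tucker-2 tensor form: fold S into an N1 x N2 x T core and apply
# A1 and A2 along modes 1 and 2 (mode products, written as one einsum).
S_tens = S.reshape(N1, N2, T)
Y_tens = np.einsum('ij,kl,jlt->ikt', A1, A2, S_tens)

# Both routes give the same observations up to reshaping.
print(np.allclose(Y_tens.reshape(M1 * M2, T), Y_flat))  # True
```

The row-major `reshape` matches NumPy's ordering convention for `np.kron`, which is why the unfolding lines up without any permutation.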

Page 7: Outline
- Dictionary learning algorithms: MOD, T-MOD, K-SVD, K-HOSVD

Page 8: Dictionary Learning via MOD
Alternating optimization (Method of Optimal Directions, MOD) for Y = A S + N:
- Given A, find S: a noisy sparse recovery problem.
- Given S, find A: a linear least squares problem, solved in closed form as A = Y S^+ (pseudo-inverse of S).
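The MOD alternation can be sketched as follows. This assumes the supports of S are known (as in the "known support" experiments later in the slides), so the sparse-coding step reduces to per-column least squares; the sizes, sparsity level, and noise level are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy sizes (hypothetical, chosen only for illustration).
M, N, T, K = 8, 12, 50, 3          # K non-zeros per column of S

A_true = rng.standard_normal((M, N))
A_true /= np.linalg.norm(A_true, axis=0)
S_true = np.zeros((N, T))
for t in range(T):
    S_true[rng.choice(N, K, replace=False), t] = rng.standard_normal(K)
Y = A_true @ S_true + 0.01 * rng.standard_normal((M, T))

# MOD alternation with known supports.
supports = [np.flatnonzero(S_true[:, t]) for t in range(T)]
A = rng.standard_normal((M, N))
S = np.zeros((N, T))
res = []
for _ in range(20):
    # Step 1: given A, find S (restricted least squares per column).
    for t, sup in enumerate(supports):
        S[sup, t] = np.linalg.lstsq(A[:, sup], Y[:, t], rcond=None)[0]
    # Step 2: given S, find A (the MOD update, a linear least squares problem).
    A = Y @ np.linalg.pinv(S)
    res.append(np.linalg.norm(Y - A @ S))

print(res[0], '->', res[-1])
```

Since each half-step exactly minimizes the fit error over one block of variables, the residual sequence `res` is non-increasing, which is the essential property of the alternating scheme.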

Page 9: MOD in 2-D — the Tensor-MOD (T-MOD)
Alternating optimization over three blocks:
- Given A_1, A_2, find S: a noisy sparse recovery problem.
- Given A_2, S, find A_1: a linear least squares problem.
- Given S, A_1, find A_2: a linear least squares problem.

Page 10: Dictionary Learning via the K-SVD
Joint optimization: instead of iterating between A and S, iterate over the atoms. Writing
    Y = sum_m a_m s_m^T + N,
update the m-th column a_m of A and the m-th row s_m^T of S jointly. This is a rank-one approximation problem for the residual
    E_m = Y - sum_{n != m} a_n s_n^T,
whose closed-form solution is given by the truncated SVD of E_m.
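The joint atom/row update at the heart of the K-SVD can be sketched as follows. The data and sizes are hypothetical toy values; the sparse-coding stage is omitted so that only the rank-one update step is shown, restricted (as in the K-SVD) to the columns that actually use the atom, so the sparsity pattern of S is preserved:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data (hypothetical sizes and sparsity level).
M, N, T = 8, 12, 40
A = rng.standard_normal((M, N))
A /= np.linalg.norm(A, axis=0)
S = np.where(rng.random((N, T)) < 0.25, rng.standard_normal((N, T)), 0.0)
Y = A @ S + 0.01 * rng.standard_normal((M, T))

def ksvd_atom_update(Y, A, S, m):
    """Jointly update atom a_m and row s_m^T via a rank-one SVD of the
    residual E_m, restricted to the columns that use atom m."""
    omega = np.flatnonzero(S[m])          # support of the m-th row
    if omega.size == 0:
        return A, S
    # Residual with atom m's own contribution added back in.
    E = Y[:, omega] - A @ S[:, omega] + np.outer(A[:, m], S[m, omega])
    U, sig, Vt = np.linalg.svd(E, full_matrices=False)
    A[:, m] = U[:, 0]                     # best rank-one left factor (unit norm)
    S[m, omega] = sig[0] * Vt[0]          # best rank-one right factor
    return A, S

before = np.linalg.norm(Y - A @ S)
for m in range(N):
    A, S = ksvd_atom_update(Y, A, S, m)
after = np.linalg.norm(Y - A @ S)
print(after <= before)  # True: each rank-one update cannot increase the error
```

Because the current contribution a_m s_m^T is itself a feasible rank-one candidate, the truncated-SVD replacement can never increase the fit error.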

Page 11: K-SVD in 2-D — the K-Higher-Order SVD (K-HOSVD)
Extending the K-SVD idea to tensors: with A = A_1 kron A_2 there are N = N_1 N_2 atoms to optimize over, and each atom update becomes a rank-one tensor approximation problem.
- The Eckart-Young theorem does not generalize to tensors.
- LS-optimal solution: no closed form exists; an iterative scheme is required [LMV00].
- Truncated Higher-Order SVD (HOSVD): closed form, very close to LS-optimal [LMV00].

[LMV00] L. De Lathauwer, B. De Moor, and J. Vandewalle, "On the best rank-1 and rank-(R_1, R_2, ..., R_N) approximation of higher-order tensors," SIAM J. Matrix Anal. Appl., vol. 21, no. 4, pp. 1324-1342, 2000.
Page 11

Page 12: Outline
- Numerical results
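The closed-form truncated-HOSVD rank-one approximation used in this update can be sketched for a 3-way tensor as follows; the test tensor is hypothetical, constructed to be exactly rank-one so the estimate can be checked against the ground truth:

```python
import numpy as np

rng = np.random.default_rng(3)

def rank1_hosvd(T3):
    """Closed-form rank-one approximation of a 3-way tensor via the
    truncated HOSVD: take the leading left singular vector of each
    mode unfolding, then project to get the (scalar) core."""
    d1, d2, d3 = T3.shape
    u1 = np.linalg.svd(T3.reshape(d1, -1), full_matrices=False)[0][:, 0]
    u2 = np.linalg.svd(np.moveaxis(T3, 1, 0).reshape(d2, -1), full_matrices=False)[0][:, 0]
    u3 = np.linalg.svd(np.moveaxis(T3, 2, 0).reshape(d3, -1), full_matrices=False)[0][:, 0]
    core = np.einsum('ijk,i,j,k->', T3, u1, u2, u3)   # scalar core
    return core * np.einsum('i,j,k->ijk', u1, u2, u3)

# On an exactly rank-one tensor, every unfolding is rank one, so the
# truncated HOSVD recovers the tensor exactly.
a, b, c = rng.standard_normal(4), rng.standard_normal(5), rng.standard_normal(6)
T3 = np.einsum('i,j,k->ijk', a, b, c)
print(np.allclose(rank1_hosvd(T3), T3))  # True
```

For general (higher-rank) tensors this closed form is only near-optimal, which matches the slide's point that the LS-optimal rank-one tensor approximation requires an iterative scheme.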

Pages 13-14: Random dictionary, known support (plots)
Pages 15-16: DCT dictionary, known support (plots)
Pages 17-18: DCT dictionary, known support, P_E = 10^-1 (plots)
Page 19: Random dictionary, estimated support, T = 300 (plot)
Page 20: Random dictionary, estimated support, P_E = 0.1 (plot)

Page 21: Conclusions
- In 2-D sparse recovery problems on a separable manifold, the dictionary has a Kronecker structure.
- The observation model can be expressed in tensor form: a Tucker-2 decomposition with a sparse core.
- Dictionary learning algorithms can efficiently exploit this tensor structure, with improved estimation accuracy.
- Demonstrated using two prominent examples:
  - Method of Optimal Directions (MOD) -> Tensor-MOD
  - K-SVD -> K-Higher-Order SVD (K-HOSVD)
- The improved accuracy was shown numerically.