Adaptive Compressive Imaging Using Sparse Hierarchical Learned Dictionaries

Adaptive Compressive Imaging Using Sparse Hierarchical Learned Dictionaries. Jarvis Haupt, University of Minnesota, Department of Electrical and Computer Engineering. Supported by

Motivation: New Agile Sensing Platforms

A Host of New Agile Imaging Sensors. [Figure: an original image alongside reconstructions from 20% and 40% of samples.]

Overview of This Talk: Fusing adaptive sensing and structured sparsity, in theory and in practice.

Background: Sparse Inference and Adaptive Sensing

A Model for Sparsity. Objects of interest are vectors $x \in \mathbb{R}^n$. Signal support: $S \triangleq \{i : x_i \neq 0\}$. Sparsity: $|S| = k \ll n$, where $k$ is the number of nonzero signal components.

A Sparse Inference Task. Noisy linear observation model: $y = Ax + w$, with $A \in \mathbb{R}^{m \times n}$ and $w \sim \mathcal{N}(0, I_{m \times m})$. Support recovery goal: obtain an (accurate) estimate $\hat{S} = \hat{S}(y, A)$ of the true support $S$.

A Sparse Inference Task. Noisy linear observation model: $y = Ax + w$, with $A \in \mathbb{R}^{m \times n}$ and $w \sim \mathcal{N}(0, I_{m \times m})$. Assume the sensing energy $\|A\|_F^2$ is fixed, $\|A\|_F^2 = R$, and that $|x_i| \geq \mu$ for all $i \in S$. What conditions are necessary/sufficient for exact support recovery (e.g., such that $P(S \neq \hat{S}) \to 0$ as $n \to \infty$)?
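
To fix notation, here is a minimal simulation of this observation model (Python/NumPy; the problem sizes and the simple thresholding decoder are illustrative assumptions, not part of the talk):

    import numpy as np

    rng = np.random.default_rng(0)
    n, m, k, mu = 256, 128, 8, 2.0       # illustrative sizes only
    R = float(m * n)                     # sensing energy (illustrative)

    # k-sparse signal with |x_i| >= mu on its support S
    x = np.zeros(n)
    S = rng.choice(n, size=k, replace=False)
    x[S] = mu * rng.choice([-1.0, 1.0], size=k)

    # Gaussian sensing matrix rescaled so the sensing energy ||A||_F^2 = R
    A = rng.standard_normal((m, n))
    A *= np.sqrt(R) / np.linalg.norm(A, 'fro')

    y = A @ x + rng.standard_normal(m)   # w ~ N(0, I_m)

    # Crude support estimate: the k largest entries of the least-squares solution
    x_ls, *_ = np.linalg.lstsq(A, y, rcond=None)
    S_hat = set(np.argsort(np.abs(x_ls))[-k:])
    print("exact support recovery:", S_hat == set(S))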

Exact Support Recovery? Point sampling: $y = x + w$ (sensing energy $R = n$). Necessary and sufficient for exact support recovery: $\mu \geq \mathrm{const} \cdot \sqrt{(n/R)\log n}$. Uncompressed sensing: (Donoho & Jin 2004; JH, Castro, & Nowak 2010). Compressed sensing: (Genovese, Jin, & Wasserman 2009; Aeron, Saligrama, & Zhao 2010).

Conditions for Exact Support Recovery. Uncompressed or compressed, non-adaptive (necessary + sufficient): $\mu \geq \mathrm{const} \cdot \sqrt{(n/R)\log n}$. Uncompressed sensing: (Donoho & Jin 2004; JH, Castro, & Nowak 2010); compressed sensing: (Genovese, Jin, & Wasserman 2009; Aeron, Saligrama, & Zhao 2010). Question: can we do better by exploiting structure, or adaptivity, or both?

Conditions for Exact Support Recovery. Uncompressed/compressed, non-adaptive (necessary + sufficient): $\mu \geq \mathrm{const} \cdot \sqrt{(n/R)\log n}$. Adaptive: $\mu \geq \mathrm{const} \cdot \sqrt{(n/R)\log k}$. Necessity: (Castro 2012); sufficiency (uncompressed): (Malloy & Nowak 2010; Malloy & Nowak 2011); sufficiency (compressed): (JH, Baraniuk, Castro, & Nowak 2012; Malloy & Nowak 2013). Model: $y = Ax + w$, $A \in \mathbb{R}^{m \times n}$, $w \sim \mathcal{N}(0, I_{m \times m})$, $\|A\|_F^2 = R$.

Beyond Simple Sparsity: The Role of Structure

Our Focus: Tree Sparsity. Consider a tree $T$ with $n$ nodes and degree $d$. Characteristics of the tree structure: the elements of $x$ are in one-to-one correspondence with the nodes of $T$, and the nonzeros of a tree-sparse vector form a rooted connected subtree of $T$. Question: does tree structure help in support recovery?
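
As a concrete aside (not from the slides), the sketch below checks whether a candidate support is tree-sparse in this sense, i.e., forms a rooted connected subtree of a tree stored via parent pointers:

    def is_rooted_connected_subtree(support, parent):
        """True if `support` (a set of node indices) forms a rooted,
        connected subtree of the tree encoded by `parent`, where
        parent[i] is the parent of node i and parent[root] == -1."""
        if not support:
            return False
        # Every supported node except the tree's root must have its parent
        # in the support; exactly one "local root" may exist, and it must
        # be the root of the whole tree.
        local_roots = [i for i in support
                       if parent[i] == -1 or parent[i] not in support]
        return len(local_roots) == 1 and parent[local_roots[0]] == -1

    # Example: a 7-node binary tree, nodes 0..6 with root 0.
    parent = [-1, 0, 0, 1, 1, 2, 2]
    print(is_rooted_connected_subtree({0, 2, 5}, parent))  # True
    print(is_rooted_connected_subtree({2, 5}, parent))     # False: detached from root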

Conditions for Exact Support Recovery. Structure alone: detection of a simple trail (uncompressed sensing) is possible when $\mu \geq \mathrm{const} \cdot \sqrt{n/R}$ (necessary + sufficient*; *for the signal detection problem: Arias-Castro, Candès, Helgason, & Zeitouni 2008). Compare the unstructured bounds: $\mu \geq \mathrm{const} \cdot \sqrt{(n/R)\log n}$ (non-adaptive) and $\mu \geq \mathrm{const} \cdot \sqrt{(n/R)\log k}$ (adaptive).

Conditions for Exact Support Recovery. The intersection of adaptivity and (tree) structure: $\mu \geq \mathrm{const} \cdot \sqrt{(k/R)\log k}$ (sufficient) (A. Soni & JH, 2011), http://arxiv.org/pdf/1111.6923.pdf. Compare: $\mu \geq \mathrm{const} \cdot \sqrt{(n/R)\log n}$ (non-adaptive, unstructured); $\mu \geq \mathrm{const} \cdot \sqrt{n/R}$ (non-adaptive, structured); $\mu \geq \mathrm{const} \cdot \sqrt{(n/R)\log k}$ (adaptive, unstructured). Akshay Soni, University of Minnesota. Recent related work: adaptivity and structure for finding activated blocks in a matrix (Balakrishnan, Kolar, Rinaldo, & Singh 2012).

Adaptive Tree Sensing: An Example. [Figure: a binary tree with root 1, children 2 and 5, and leaves 3, 4, 6, 7.] Measure the root: $y^{(1)} = \gamma x_1 + w^{(1)}$; suppose $y^{(1)} > \tau$, so node 1 is declared active and its children are measured. Next, $y^{(2)} = \gamma x_2 + w^{(2)}$; suppose $y^{(2)} < \tau$, so the subtree rooted at node 2 is pruned. Then $y^{(3)} = \gamma x_5 + w^{(3)}$ (suppose $y^{(3)} > \tau$), $y^{(4)} = \gamma x_6 + w^{(4)}$ (suppose $y^{(4)} < \tau$), and $y^{(5)} = \gamma x_7 + w^{(5)}$ (suppose $y^{(5)} < \tau$). If the hypothesis test is correct at each step, then $m = dk + 1 = O(k)$ measurements suffice. Adaptive wavelet tree sensing in the literature: non-Fourier encoded MRI (Panych & Jolesz, 1994); compressive imaging (Deutsch, Averbuch, & Dekel, 2009). None analyzed the case of noisy measurements.
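
A sketch of this top-down procedure in code (Python; the scaling, threshold, and the use of a two-sided test are illustrative assumptions) may clarify the mechanics:

    import numpy as np

    def tree_sense(x, children, root, gamma, tau, rng):
        """Top-down adaptive tree sensing: measure a node and descend to
        its children only if the noisy measurement clears the threshold.
        Returns the estimated support and the number of measurements used."""
        S_hat, m = set(), 0
        frontier = [root]
        while frontier:
            i = frontier.pop()
            y = gamma * x[i] + rng.standard_normal()   # y = gamma*x_i + w
            m += 1
            if abs(y) > tau:                           # hypothesis test
                S_hat.add(i)
                frontier.extend(children[i])
        return S_hat, m

    # The 7-node tree from the slide: root 1, children 2 and 5, leaves 3,4,6,7.
    children = {1: [2, 5], 2: [3, 4], 5: [6, 7], 3: [], 4: [], 6: [], 7: []}
    x = {i: 0.0 for i in range(1, 8)}
    x[1], x[5] = 3.0, 3.0                              # tree-sparse support {1, 5}
    print(tree_sense(x, children, root=1, gamma=2.0, tau=3.0,
                     rng=np.random.default_rng(1)))    # ideally ({1, 5}, 5)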

Orthogonal Dictionaries and Tree Sparsity. Consider signals $z \in \mathbb{R}^p$ that are sparse in a known dictionary $D \in \mathbb{R}^{p \times n}$. That is, $z = Dx$, where $x \in \mathbb{R}^n$ is $k$-sparse, $D$ satisfies $D^T D = I_{n \times n}$, and the columns of $D$ are $d_j$, $j = 1, 2, \ldots, n$. We are interested in the case where $x$ is tree-sparse. Collect (noisy) observations of $z$ by projecting onto (scaled) columns of $D$. Suppose, for example, that the $j$-th measurement is obtained by projecting onto column $d_i$; then $y^{(j)} = \gamma\, d_i^T z + w^{(j)}$, where $w^{(j)} \sim \mathcal{N}(0, 1)$ and $\gamma \geq 0$ is a nonnegative scaling factor (equivalently, one could consider a non-unit noise variance).

Support Recovery via Adaptive Tree Sensing. Theorem (A. Soni & JH, 2011). Let $T_{n,d}$ be a balanced, rooted, connected tree of degree $d$ with $n$ nodes. Suppose that $z \in \mathbb{R}^p$ can be expressed as $z = Dx$, where $D$ is a known dictionary with orthonormal columns and $x$ is $k$-sparse. Further, suppose the support of $x$ corresponds to a rooted connected subtree of $T_{n,d}$. Observations of $z$ take the form of projections of $z$ onto columns of $D$. Let the index corresponding to the root of $T_{n,d}$ be known, and apply the top-down tree sensing procedure with threshold $\tau$ and scaling parameter $\gamma$. For any $c_1 > 0$ and $c_2 \in (0, 1)$, there exists a constant $c_3 > 0$ such that if $\mu = \min_{i \in S} |x_i| \geq \sqrt{c_3\, \gamma^{-2} \log k}$ and $\tau = c_2 \mu$, the tree sensing procedure collects $m = dk + 1$ measurements and produces a support estimate $\hat{S}$ that equals $S$ with probability at least $1 - k^{-c_1}$. Choose $\gamma = \sqrt{R / ((d+1)k)}$; then the theorem guarantees exact support recovery (whp) when $\mu \geq \sqrt{c_3 (d+1) k \log k / R}$.
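
Plugging in numbers makes the scaling concrete; the constants below are placeholders (the theorem only asserts that suitable $c_1, c_2, c_3$ exist):

    import numpy as np

    n, d, k, R = 127, 2, 10, 127.0     # placeholder problem parameters
    c2, c3 = 0.5, 4.0                  # placeholder constants

    m = d * k + 1                                        # measurement budget
    gamma = np.sqrt(R / ((d + 1) * k))                   # scaling from the theorem
    mu_min = np.sqrt(c3 * (d + 1) * k * np.log(k) / R)   # required signal amplitude
    tau = c2 * mu_min                                    # detection threshold
    print(f"m = {m}, gamma = {gamma:.3f}, mu_min = {mu_min:.3f}, tau = {tau:.3f}")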

Conditions for Exact Support Recovery. The intersection of adaptivity and (tree) structure: $\mu \geq \mathrm{const} \cdot \sqrt{(k/R)\log k}$ (sufficient) (A. Soni & JH, 2011), http://arxiv.org/pdf/1111.6923.pdf. Compare: $\mu \geq \mathrm{const} \cdot \sqrt{(n/R)\log n}$ (non-adaptive, unstructured); $\mu \geq \mathrm{const} \cdot \sqrt{n/R}$ (non-adaptive, structured; *conjecture for support recovery); $\mu \geq \mathrm{const} \cdot \sqrt{(n/R)\log k}$ (adaptive, unstructured). Akshay Soni, University of Minnesota.

LASeR: Learning Adaptive Sensing Representations

Beyond Wavelet Trees: Learned Representations. Given training data $Z \in \mathbb{R}^{p \times q}$, we want to learn a dictionary $D$ so that $Z \approx DX$, with $D \in \mathbb{R}^{p \times n}$, $X \in \mathbb{R}^{n \times q}$, and each column of $X$, $x_i \in \mathbb{R}^n$, tree-sparse in $T_{n,d}$. Pose this as an optimization: $\{D, X\} = \arg\min_{D \in \mathbb{R}^{p \times n},\ D^T D = I_{n \times n},\ \{x_i\}} \sum_{i=1}^{q} \left( \|z_i - D x_i\|_2^2 + \lambda\, \Omega(x_i) \right)$. The regularization term is $\Omega(x_i) = \sum_{g \in G} \omega_g \|(x_i)_g\|$, where $G$ denotes a set of (overlapping) groups of indices for $x$, $(x_i)_g$ is $x_i$ restricted to the indices in group $g \in G$, the $\omega_g$ are nonnegative weights, and the norm can be, e.g., $\ell_2$ or $\ell_\infty$. Solve by alternating minimization over $D$ and $X$ (Jenatton, Mairal, Obozinski, & Bach, 2010). Sparse Modeling Software (SPAMS): http://spams-devel.gforge.inria.fr/
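
A compact NumPy sketch of this alternating scheme is below, under simplifying assumptions: $\ell_2$ group norms with unit weights, a 1/2 factor on the data-fit term for a clean proximal step, and the orthonormal dictionary update computed in closed form as an orthogonal Procrustes problem. The hierarchical proximal step composes group soft-thresholdings from the leaves to the root, following Jenatton et al.; this is an illustration, not the SPAMS implementation.

    import numpy as np

    def prox_tree(u, groups, lam):
        """Proximal operator of lam * sum_g ||u_g||_2 for tree-structured
        groups (node plus descendants), applied deepest-first, following
        Jenatton, Mairal, Obozinski, & Bach (2010)."""
        v = u.copy()
        for g in groups:                       # groups ordered leaves-to-root
            nrm = np.linalg.norm(v[g])
            v[g] = 0.0 if nrm <= lam else v[g] * (1.0 - lam / nrm)
        return v

    def learn_dictionary(Z, groups, n_atoms, lam, n_iters=50, seed=0):
        """Alternate between sparse coding and a dictionary update for
        min 0.5*||Z - D X||_F^2 + lam * Omega(X)  s.t.  D^T D = I.
        With D orthonormal, sparse coding is exact: x_i = prox(D^T z_i)."""
        p, q = Z.shape
        rng = np.random.default_rng(seed)
        D, _ = np.linalg.qr(rng.standard_normal((p, n_atoms)))  # orthonormal init
        for _ in range(n_iters):
            # Sparse coding step: columnwise proximal update.
            X = np.stack([prox_tree(D.T @ Z[:, i], groups, lam)
                          for i in range(q)], axis=1)
            # Dictionary step (orthogonal Procrustes): D = U V^T, Z X^T = U S V^T.
            U, _, Vt = np.linalg.svd(Z @ X.T, full_matrices=False)
            D = U @ Vt
        return D, X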

Group Specifications to Enforce Tree Structure. Example: a binary tree with 15 nodes and 4 levels. [Figure: the tree with nodes labeled 1 through 15.] Each group consists of a node together with all of its descendants (the hierarchical construction of Jenatton et al.), so zeroing a group removes an entire subtree. The number of groups is the same as the number of nodes (but the group sizes vary).
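
For concreteness, a short helper (hypothetical, not from the talk) that enumerates these node-plus-descendants groups for a complete binary tree, in the deepest-first order the proximal sketch above expects:

    def binary_tree_groups(n_levels):
        """One group per node of a complete binary tree with nodes
        0..2**n_levels - 2 in level order: each group is a node plus all
        of its descendants. Returned deepest-first (leaves to root)."""
        n = 2 ** n_levels - 1
        def node_group(i):
            out, stack = [i], [i]
            while stack:
                j = stack.pop()
                for c in (2 * j + 1, 2 * j + 2):   # level-order children
                    if c < n:
                        out.append(c)
                        stack.append(c)
            return out
        # In level order, larger indices are never shallower, so reversed
        # index order processes descendants before their ancestors.
        return [node_group(i) for i in reversed(range(n))]

    groups = binary_tree_groups(4)        # 15 nodes -> 15 groups of varying size
    print(len(groups), len(groups[-1]))   # the last (root) group has 15 nodes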

LASeR: An Illustrative Example

Learning Adaptive Sensing Representations. [Figure: pipeline diagram, Training Data → Structured Sparsity → LASeR → Adaptive Sensing.] Learn a representation for 163 images from the Psychological Image Collection at Stirling (PICS), http://pics.psych.stir.ac.uk/. Example images (128 × 128).

Learned Orthogonal Tree-Basis Elements (First four levels of 7 total)

[Figure: an original image alongside the tree elements present in its sparse representation.]

Qualitative Results, sensing energy $R = 128 \cdot 128$. [Figure: reconstructions at m = 20, 50, and 80 measurements for Wavelet Tree Sensing, PCA, CS LASSO, CS Tree LASSO, and LASeR.]

Qualitative Results, sensing energy $R = 128 \cdot 128 \cdot 32$. [Figure: reconstructions at m = 20, 50, and 80 measurements for Wavelet Tree Sensing, PCA, CS LASSO, CS Tree LASSO, and LASeR.]

Quantitative Results. [Figure: three plots of reconstruction SNR (dB) versus number of projections (0 to 140).]

LASeR: Imaging via Patch-wise Sensing

Patch-wise Sensing Experiment, motivated by an EO imaging application (thanks: Bob Muise, Lockheed Martin). Training data: 3 sample images from the Columbus Large Image Format (CLIF) 2007 dataset; each image is 1024 × 1024. Randomly extracted 3000 32 × 32 patches (at random locations) and vectorized them into length-1024 vectors. Applied PCA and LASeR (a 7-level, 127-node binary tree) to this training data.
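
The data preparation step could be sketched as follows (Python/NumPy; the random placeholder images stand in for the CLIF training frames):

    import numpy as np

    def extract_patches(images, n_patches=3000, patch=32, seed=0):
        """Sample square patches at random locations from 2-D images and
        vectorize each one into a column of the training matrix Z."""
        rng = np.random.default_rng(seed)
        cols = []
        for _ in range(n_patches):
            img = images[rng.integers(len(images))]
            r = rng.integers(img.shape[0] - patch + 1)
            c = rng.integers(img.shape[1] - patch + 1)
            cols.append(img[r:r + patch, c:c + patch].reshape(-1))
        return np.stack(cols, axis=1)      # Z has shape (patch*patch, n_patches)

    # Placeholders standing in for the three 1024 x 1024 CLIF training images.
    images = [np.random.default_rng(i).random((1024, 1024)) for i in range(3)]
    Z = extract_patches(images)            # Z is 1024 x 3000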

Compare: PCA Basis Elements. In the tree-sensing context, the PCA sensing approach can be viewed as tree sensing on a tree of degree 1 (i.e., a chain).

Learned Orthogonal Tree-Basis Elements

Example: Approximation by Patch-wise Sensing. Test image: another image from the CLIF database. Sense and reconstruct non-overlapping 32 × 32 patches, comparing LASeR, PCA, and wavelets.

Approximation Results, Uniform Sampling Rate. Sampling rate: 12.5%. LASeR rsnr = 16.5 dB; PCA rsnr = 17.6 dB. Here $\mathrm{rsnr} \triangleq -20 \log_{10}\!\left( \|\hat{x} - x\|_F / \|x\|_F \right)$.
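
As a one-line reference implementation (the sign convention is inferred from the positive values reported on these slides):

    import numpy as np

    def rsnr_db(x, x_hat):
        """Reconstruction SNR in dB: -20*log10(||x_hat - x||_F / ||x||_F)."""
        return -20.0 * np.log10(np.linalg.norm(x_hat - x) / np.linalg.norm(x))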

Approximation Results, Uniform Sampling Rate. Sampling rate: 12.5%. LASeR rsnr = 16.5 dB. [Figure: LASeR reconstruction.]

Approximation Results, Uniform Sampling Rate. Sampling rate: 12.5%. PCA rsnr = 17.6 dB. [Figure: PCA reconstruction.]

Approximation Results, Uniform Sampling Rate. Sampling rate: 12.5%. LASeR rsnr = 16.5 dB; 2D Haar wavelet rsnr = 13.5 dB. [Figure: LASeR and 2D Haar reconstructions.]

Approximation Results, Adaptive Sampling Rate. Average sampling rate: 7.2%. LASeR rsnr = 13.9 dB; PCA rsnr = 15.0 dB. [Figure: LASeR and PCA reconstructions.]

Approximation Results, Adaptive Sampling Rate. Average sampling rate: 7.2%. LASeR rsnr = 13.9 dB; 2D Haar wavelet rsnr = 11.9 dB. [Figure: LASeR and 2D Haar reconstructions.]

Approximation: Zoomed In. A closer look (average sampling rate: 7.2%). [Figure: zoomed crops of Original, LASeR, PCA, and 2D Haar.]

Approximation: Zoomed In. A closer look (average sampling rate: 7.2%). [Figure: zoomed crops of Original, LASeR, PCA, and 2D Haar.]

Sampling Rate Adapts to Block Complexity. Sampling rate per block (average sampling rate: 7.2%). [Figure: per-block sampling-rate maps for LASeR and PCA.]

Sampling Rate Adapts to Block Complexity. Sampling rate per block (average sampling rate: 7.2%). [Figure: per-block sampling-rate maps for LASeR and 2D Haar.]

Summary: Adaptivity + Structure, In Theory (Support Recovery). $\mu \geq \mathrm{const} \cdot \sqrt{(n/R)\log n}$ (non-adaptive, unstructured); $\mu \geq \mathrm{const} \cdot \sqrt{n/R}$ (non-adaptive, structured; conj.); $\mu \geq \mathrm{const} \cdot \sqrt{(n/R)\log k}$ (adaptive, unstructured); $\mu \geq \mathrm{const} \cdot \sqrt{(k/R)\log k}$ (adaptive, structured). Conclusion: a polynomial reduction in the SNR required for exact support recovery (for fixed sensing energy $R$).

Summary: Adaptivity + Structure, In Practice (Learned Representations and Patch-wise Sensing). [Figure: Original, LASeR, and PCA reconstructions.] Conclusion: PCA works very well on small patch sizes (shared, elemental structure) in noise-free settings.

Summary: Adaptivity + Structure. [Figure: Wavelet Tree Sensing, PCA, and LASeR reconstructions at m = 20, 50, and 80, with the original image.] Conclusions: the potential benefit of learned representations depends on patch size, data regularity, and noise. Use LASeR on PCA residuals? (i.e., fuse PCA and LASeR.) www.ece.umn.edu/~jdhaupt, jdhaupt@umn.edu