Lecture 5: Sparse Models


Lecture 5: Sparse Models
- Homework 3 discussion (Nima)
- Sparse Models Lecture
  - Reading: Murphy, Chapters 13.1, 13.3, 13.6.1
  - Reading: Peter Knee, Chapter 2
- Paolo Gabriel (TA): Neural Brain Control
After class:
- Project groups (Nima)
- Installation: TensorFlow, Python, Jupyter (TAs)

Homework 3: Fisher Discriminant

Sparse model: linear regression (with sparsity constraints). See slide 4 from Lecture 4.

Sparse model: y = Ax + n, where y are the measurements, A is the dictionary, n is noise, and x are the sparse weights. The dictionary A comes either from physical models or is learned from data (dictionary learning).
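As an illustration (not from the slides), a minimal numpy sketch of this model; the dimensions, noise level, and random Gaussian dictionary are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
M, N, K = 20, 100, 3                     # measurements, dictionary atoms, sparsity

A = rng.standard_normal((M, N))          # dictionary (here: random Gaussian)
A /= np.linalg.norm(A, axis=0)           # normalize columns to unit norm

x = np.zeros(N)                          # K-sparse weight vector
support = rng.choice(N, size=K, replace=False)
x[support] = rng.standard_normal(K)

n = 0.01 * rng.standard_normal(M)        # additive noise
y = A @ x + n                            # measurements y = Ax + n
```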

Sparse processing: linear regression with sparsity constraints. An underdetermined system of equations has many solutions; by exploiting the fact that x is sparse, it can often be solved. This depends on the structure of A (RIP, the Restricted Isometry Property). Various sparse algorithms:
- Convex optimization (basis pursuit / LASSO / L1 regularization)
- Greedy search (matching pursuit / OMP)
- Bayesian analysis (sparse Bayesian learning, SBL)
This gives a low-dimensional understanding of high-dimensional data sets, and is also referred to as compressive sensing (CS).

Different applications, but the same algorithm y = Ax:

y                     A                 x
Frequency signal      DFT matrix        Time signal
Compressed image      Random matrix     Pixel image
Array signals         Beam weights      Source locations
Reflection sequence   Time delays       Layer reflectors

CS approach to geophysical data analysis (figure: example panels, axes DOA (deg) and time):
- CS of earthquakes: Yao, GRL 2011; PNAS 2013
- CS beamforming: Xenaki, JASA 2014, 2015; Gerstoft, JASA 2015
- Sequential CS: Mecklenbrauker, TSP 2013
- CS fathometer: Yardim, JASA 2014
- CS sound speed estimation: Bianco, JASA 2016
- CS matched field: Gemba, JASA 2016

Sparse signals / compressive signals are important:
- We don't need to sample at the Nyquist rate.
- Many signals are sparse but are solved under non-sparse assumptions: beamforming, Fourier transform, layered structure.
- Inverse methods are inherently sparse: we seek the simplest way to describe the data.
All this requires new developments:
- Mathematical theory
- New algorithms (interior-point solvers, convex optimization)
- Signal processing
- New applications/demonstrations

Sparse recovery: we try to find the sparsest solution which explains our noisy measurements. Here, the L0-norm $\|x\|_0$ is shorthand notation for counting the number of non-zero elements in x.

Sparse recovery using the L0-norm. Underdetermined problem: y = Ax, with M < N. Prior information: x is K-sparse, with K ≪ N:

$$\|x\|_0 = \sum_{n=1}^{N} 1_{x_n \neq 0} = K$$

Not really a norm: $\|ax\|_0 = \|x\|_0 \neq |a| \, \|x\|_0$. There are only a few sources, with unknown locations and amplitudes. The L0-norm solution involves exhaustive search: combinatorial complexity, not computationally feasible.
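The combinatorial cost is easy to make concrete. A hypothetical brute-force L0 solver (illustration only, not from the slides) must solve a least-squares problem for every one of the C(N, K) candidate supports, which is feasible only for tiny N:

```python
import itertools
import numpy as np

def l0_exhaustive(A, y, K):
    """Brute-force L0 recovery: try every size-K support and keep the
    least-squares fit with the smallest residual. C(N, K) subproblems."""
    N = A.shape[1]
    best_err, best_x = np.inf, None
    for support in map(list, itertools.combinations(range(N), K)):
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        err = np.linalg.norm(y - A[:, support] @ coef)
        if err < best_err:
            best_err = err
            best_x = np.zeros(N)
            best_x[support] = coef
    return best_x
```

Even for the small example above (N = 100, K = 3) this is already 161,700 least-squares solves, and the count grows combinatorially with N and K.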

Lp-norm:

$$\|x\|_p = \left( \sum_{m=1}^{M} |x_m|^p \right)^{1/p} \quad \text{for } p > 0$$

Classic choices for p are 1, 2, and ∞. We will misuse notation and also allow p = 0.
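For reference (not from the slides), these norms in a few lines of numpy, with the p = 0 misuse included:

```python
import numpy as np

def lp_norm(x, p):
    """||x||_p = (sum |x_m|^p)^(1/p) for p > 0; p = 0 counts nonzeros (notation misuse)."""
    if p == 0:
        return np.count_nonzero(x)
    return np.sum(np.abs(x) ** p) ** (1.0 / p)

x = np.array([0.0, 3.0, -4.0, 0.0])
print(lp_norm(x, 0))         # 2   (number of nonzero entries)
print(lp_norm(x, 1))         # 7.0
print(lp_norm(x, 2))         # 5.0
print(np.max(np.abs(x)))     # 4.0, the p -> infinity limit
```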

Lp-norm (graphical representation of the unit balls of $\|x\|_p = \left( \sum_{m=1}^{M} |x_m|^p \right)^{1/p}$ for different p).

Solutions for sparse recovery:
- Exhaustive search: L0 regularization, not computationally feasible
- Convex optimization: L1 regularization / basis pursuit / LASSO
- Greedy search: matching pursuit / orthogonal matching pursuit (OMP; sketched below)
- Bayesian analysis: sparse Bayesian learning (SBL)
- Regularized least squares: L2 regularization, reference solution, not actually sparse
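As one concrete example from this list, a minimal sketch of orthogonal matching pursuit, assuming unit-norm columns of A and a known sparsity K as the stopping rule (practical variants stop on a residual threshold instead):

```python
import numpy as np

def omp(A, y, K):
    """Orthogonal matching pursuit: greedily add the column most correlated
    with the current residual, then re-fit by least squares on the support."""
    N = A.shape[1]
    support, residual = [], y.copy()
    for _ in range(K):
        idx = int(np.argmax(np.abs(A.T @ residual)))   # best-matching atom
        support.append(idx)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef            # orthogonalize against support
    x_hat = np.zeros(N)
    x_hat[support] = coef
    return x_hat
```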

Slides 8/9, Lecture 4: the regularized least squares solution. That solution is not sparse.
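To make "not sparse" concrete (an illustration, not from the slides): the closed-form L2-regularized solution applied to a random underdetermined system leaves every weight nonzero; L2 shrinks the weights but does not zero them out:

```python
import numpy as np

def ridge(A, y, mu):
    """Closed-form L2-regularized least squares: (A^T A + mu I)^{-1} A^T y."""
    N = A.shape[1]
    return np.linalg.solve(A.T @ A + mu * np.eye(N), A.T @ y)

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 100))
y = rng.standard_normal(20)
x_hat = ridge(A, y, mu=0.1)
print(np.count_nonzero(x_hat))   # 100: every weight is (numerically) nonzero
```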

Basis pursuit / LASSO / L1 regularization. The L0-norm minimization is not convex and requires a combinatorial search, making it computationally impractical. We make the problem convex by substituting the L1-norm in place of the L0-norm:

$$\min_x \|x\|_1 \quad \text{subject to} \quad \|Ax - b\|_2 < \varepsilon$$

This can also be formulated in an unconstrained form (next slide).

The unconstrained LASSO formulation. Constrained formulation of the ℓ1-norm minimization problem:

$$\hat{x}_{\ell_1}(\epsilon) = \arg\min_{x \in \mathbb{C}^N} \|x\|_1 \quad \text{subject to} \quad \|y - Ax\|_2 \le \epsilon$$

Unconstrained formulation in the form of a least squares optimization with an ℓ1-norm regularizer:

$$\hat{x}_{\text{LASSO}}(\mu) = \arg\min_{x \in \mathbb{C}^N} \|y - Ax\|_2^2 + \mu \|x\|_1$$

For every ε there exists a µ so that the two formulations are equivalent. Regularization parameter: µ.
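One common way to minimize the unconstrained objective is proximal gradient descent (ISTA); the slides don't name a specific solver, so this is a minimal illustrative sketch:

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1: shrink each entry toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, y, mu, n_iter=500):
    """Minimize ||y - A x||_2^2 + mu * ||x||_1 by proximal gradient descent."""
    L = 2.0 * np.linalg.norm(A, ord=2) ** 2    # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ x - y)         # gradient of the quadratic term
        x = soft_threshold(x - grad / L, mu / L)
    return x
```

The soft-thresholding step is what produces exact zeros in the solution, which is why the L1 solution is sparse while the L2 solution is not.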

Basis pursuit / LASSO / L1 regularization. Why is it OK to substitute the L1-norm for the L0-norm? What are the conditions such that the two problems have the same solution?

$$\min_x \|x\|_1 \quad \text{subject to} \quad \|Ax - b\|_2 < \varepsilon$$

$$\min_x \|x\|_0 \quad \text{subject to} \quad \|Ax - b\|_2 < \varepsilon$$

Answer: the Restricted Isometry Property (RIP).
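RIP constants are hard to compute directly; a computable proxy often used in their place (added here for illustration, not on the slide) is the mutual coherence of A:

```python
import numpy as np

def mutual_coherence(A):
    """mu(A) = max_{i != j} |<a_i, a_j>| over unit-normalized columns of A."""
    G = A / np.linalg.norm(A, axis=0)   # unit-norm columns
    gram = np.abs(G.T @ G)
    np.fill_diagonal(gram, 0.0)         # ignore self-inner-products on the diagonal
    return gram.max()
```

A classical sufficient condition states that L1 minimization recovers any K-sparse x exactly (in the noiseless case) when K < (1 + 1/µ(A))/2, although this coherence bound is much more conservative than RIP-based guarantees.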

Geometrical view (figure from Bishop): L2 regularization vs. L1 regularization.

Regularization parameter selection. The objective function of the LASSO problem:

$$L(x, \mu) = \|y - Ax\|_2^2 + \mu \|x\|_1$$

Sparsity depends on the regularization parameter µ: for µ large, x = 0; for µ small, the solution is non-sparse.

Regularization path (figure from Murphy): L2 regularization vs. L1 regularization, with coefficients plotted against 1/µ. As the regularization parameter µ is decreased, more and more weights become active; thus µ controls the sparsity of the solution.
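This behavior is easy to reproduce numerically; a minimal sketch (not from the slides) using scikit-learn's Lasso, whose objective is (1/(2M))·||y − Ax||²₂ + α·||x||₁, so its α plays the role of µ up to a scale factor:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
M, N, K = 20, 100, 3
A = rng.standard_normal((M, N))
x_true = np.zeros(N)
x_true[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)
y = A @ x_true + 0.01 * rng.standard_normal(M)

for alpha in [1.0, 0.1, 0.01, 0.001]:
    lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=100_000)
    x_hat = lasso.fit(A, y).coef_
    print(f"alpha={alpha}: {np.count_nonzero(x_hat)} active weights")
# Larger alpha (larger mu) -> sparser solution; smaller alpha -> more active weights.
```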

Applications:
- MEG/EEG/MRI source localization (earthquake location)
- Channel equalization
- Compressive sampling (beyond Nyquist sampling)
- Compressive camera
- Beamforming
- Fathometer
- Geoacoustic inversion
- Sequential estimation

Beamforming / DOA estimation

Additional Resources