Random Coding for Fast Forward Modeling

Random Coding for Fast Forward Modeling

Justin Romberg, with William Mantzel, Salman Asif, Karim Sabra, and Ramesh Neelamani
Georgia Tech, School of ECE
Workshop on Sparsity and Computation, June 11, 2010, Bonn, Germany

J. Romberg (GaTech), Random Fast Forward, Bonn '10, slide 1 / 23

Forward simulations as acquisition problems

- Fast forward modeling for seismic exploration: given a candidate model of the earth, simulate a field acquisition to infer the responses between each source/receiver pair.
- Matched field processing (matched filtering) for acoustic source localization: quickly test different source locations in a complicated environment.

Seismic imaging

[Figure: a receiver records y_k at the surface; sources p_1, p_2, p_3 couple to it through the earth via the channel responses h_{1,k}, h_{2,k}, h_{3,k}.]

Forward modeling/simulation

Given a candidate model of the earth, we want to estimate the channel between each source/receiver pair.

[Figure: sources p_1, ..., p_4 over a candidate earth model; the received set of simulated outputs gives the channel responses h_{:,1}, h_{:,2}, h_{:,3}, h_{:,4}.]

Simultaneous activation

- Run a single simulation with all of the sources activated simultaneously with random waveforms.
- The channel responses interfere with one another, but the randomness codes them in such a way that they can be separated later.

[Figure: sources p_1, ..., p_4 driven by random pulses over the candidate model; the received set y is a superposition of the coded channel responses h_{:,1}, ..., h_{:,4}.]

Multiple channel linear algebra

y_k = [ G_1  G_2  ...  G_c ] [ h_{1,k} ; h_{2,k} ; ... ; h_{c,k} ]

where G_j is convolution with the pulse p_j and each channel h_{j,k} has length n. How long does each pulse need to be to recover all of the channels? (The system is m × nc, with m = pulse length and c = # of channels.) Of course we can do it for m ≥ nc.
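
The easy regime m ≥ nc can be checked with a minimal sketch (toy sizes, hypothetical names, circular convolution assumed for simplicity): stack the convolution blocks into one m × nc matrix and recover every channel at once by ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
n, c = 16, 3          # channel length, number of channels (toy sizes)
m = n * c + 8         # pulse length, chosen so that m >= n*c

# One random pulse per source; G_j is circular convolution with pulse p_j,
# restricted to the n channel taps (an m x n block of shifted pulses).
pulses = rng.standard_normal((c, m))

def conv_block(p, n):
    # m x n matrix whose k-th column is the pulse circularly shifted by k
    return np.column_stack([np.roll(p, k) for k in range(n)])

Phi = np.hstack([conv_block(pulses[j], n) for j in range(c)])  # m x (n*c)

# True channels, stacked; the received trace is a superposition of convolutions
h = rng.standard_normal(n * c)
y = Phi @ h

# With m >= n*c the system is overdetermined, so least squares is exact
h_hat, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print(np.allclose(h_hat, h, atol=1e-8))
```

The interesting question on the following slides is how much shorter the pulses can be when the channels are sparse.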

Restricted isometries for multichannel systems

Stacking the channels into h_k ∈ R^{nc}, the received trace is y_k = Φ h_k with Φ = [ G_1  G_2  ...  G_c ].

Theorem: with each of the pulses an iid Gaussian sequence, Φ obeys

(1 − δ) ‖h‖_2² ≤ ‖Φh‖_2² ≤ (1 + δ) ‖h‖_2²  for all s-sparse h ∈ R^{nc}

when m ≥ C_δ ( s log⁵(nc) + n ).

Consequence: we can separate the channels using short random pulses (using ℓ1 minimization or other sparse recovery algorithms).
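
The consequence can be illustrated with a small sketch. As an assumption for simplicity, a dense iid Gaussian matrix stands in for the random-pulse system, and orthogonal matching pursuit stands in for ℓ1 minimization; all sizes and names are toy choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n, c, s = 64, 4, 4           # channel length, # channels, total sparsity
N = n * c                    # stacked channel dimension
m = 100                      # "pulse length": far below N = 256

# Dense iid Gaussian stand-in for the random-pulse matrix Phi
Phi = rng.standard_normal((m, N)) / np.sqrt(m)

# s-sparse stacked channel vector h
h = np.zeros(N)
support = rng.choice(N, size=s, replace=False)
h[support] = rng.standard_normal(s)
y = Phi @ h

def omp(Phi, y, s):
    """Orthogonal matching pursuit: a greedy stand-in for l1 minimization."""
    residual, idx = y.copy(), []
    for _ in range(s):
        # pick the column most correlated with the residual, then re-fit
        idx.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, idx], y, rcond=None)
        residual = y - Phi[:, idx] @ coef
    h_hat = np.zeros(Phi.shape[1])
    h_hat[idx] = coef
    return h_hat

h_hat = omp(Phi, y, s)
print(np.linalg.norm(h_hat - h))   # small: the channels separate
```

With m well below nc, recovery succeeds because the stacked channel vector is sparse, exactly the regime the theorem describes.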

Multichannel theory

Let F be the DFT matrix and the G_i diagonal matrices of iid Gaussians. We can write

Φ = [ F* G_1 F   F* G_2 F   ...   F* G_c F ]

so that Φ*Φ is the block matrix whose (j,k) block is F* G_j* G_k F, and

I − Φ*Φ = Σ_{k=1}^{c} Σ_{ω=1}^{m} (1 − |g_k(ω)|²) f_{k,ω} f_{k,ω}*  −  Σ_{j≠k} Σ_{ω=1}^{m} g_j(ω)* g_k(ω) f_{j,ω} f_{k,ω}*

where f_{k,ω} is the ω-th Fourier vector embedded in block k. This is a sum of rank-1 matrices; we can control its action on s-sparse signals using tools closely related to Rudelson and Vershynin's uniform operator law of large numbers.
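
The rank-one expansion can be checked numerically on a toy instance. The sketch below assumes the simplified square case (channel length equal to the pulse length m, unitary DFT), builds Φ from its Fourier-domain blocks, and compares the expansion of I − Φ*Φ against a direct computation.

```python
import numpy as np

rng = np.random.default_rng(2)
m, c = 8, 3   # toy sizes; assumes channel length n = m for simplicity

F = np.fft.fft(np.eye(m)) / np.sqrt(m)        # unitary DFT matrix
g = rng.standard_normal((c, m))               # diagonals of the G_k

# Phi = [ F* G_1 F | ... | F* G_c F ]
Phi = np.hstack([F.conj().T @ np.diag(g[k]) @ F for k in range(c)])

def f_vec(k, w):
    # the omega-th Fourier vector embedded in block k of R^{cm}
    v = np.zeros(c * m, dtype=complex)
    v[k * m:(k + 1) * m] = F.conj().T[:, w]
    return v

# Rank-one expansion of I - Phi* Phi, term by term
M = np.zeros((c * m, c * m), dtype=complex)
for k in range(c):
    for w in range(m):
        M += (1 - g[k, w] ** 2) * np.outer(f_vec(k, w), f_vec(k, w).conj())
for j in range(c):
    for k in range(c):
        for w in range(m):
            if j != k:
                M -= g[j, w] * g[k, w] * np.outer(f_vec(j, w), f_vec(k, w).conj())

direct = np.eye(c * m) - Phi.conj().T @ Phi
print(np.allclose(M, direct))   # the expansion matches the direct computation
```

The diagonal terms (1 − |g_k(ω)|²) vanish in expectation, and the cross terms are zero-mean, which is what makes the concentration argument go through.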

Seismic imaging simulation

- An array of 128 × 64 = 8192 sources is activated simultaneously, with 1 receiver.
- Sparsity is enforced in the curvelet domain.

[Figure 1: source and receiver geometry; 8192 (128 × 64) sources and 1 receiver.]
[Figure 2: desired band-limited Green's functions obtained by sequential-source modeling, for (a) a simple case and (b) a more complex case.]

Seismic imaging simulation

[Figure 5: simulation results for the more complex Green's function with the random impulsive-source approach: (a) estimate (16× faster, SNR = 9.6 dB), (b) estimation error (Figure 2b minus 5a), (c) cross-correlation estimate.]

The result was produced with 16× compression in the computations; we can even take this example down to 32×.

MIMO channel estimation

[Figure: MIMO channel between an array of transmitters j and an array of receivers k.]

- Estimate all channel responses h_{j,k} between source j and receiver k.
- Activating with diverse source signatures allows us to separate the cross-talk.
- This reduces the total amount of time we spend probing the channels.
- Other applications: underwater/wireless MIMO channel equalization, MIMO radar imaging, etc.

Application: increased field-of-view with coded apertures

Architecture proposed by Marcia et al. '08.

Acoustic source localization

- We measure the response through a known, complicated channel.
- For a source located at γ, the response is g_γ.
- Matched field processing: we test each candidate location τ by correlating the measurements against a simulated response, max_τ ⟨g_τ, g_γ⟩; the vector of all these correlations is G^T g_γ.
- Given g_τ, time reversal can be used to calculate G^T g_τ.
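
A minimal sketch of matched field processing, with random vectors standing in for the PDE-simulated responses (all sizes and names here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(6)
n_grid, m = 300, 500   # candidate locations, length of each response

# Columns of G are the simulated responses g_tau; random vectors model a
# complicated channel (a stand-in for the talk's PDE-based simulations)
G = rng.standard_normal((m, n_grid)) / np.sqrt(m)

gamma = 42
g_gamma = G[:, gamma] + 0.05 * rng.standard_normal(m)   # noisy observation

# Matched field processing: correlate against every candidate response.
# The whole vector of correlations is exactly G^T g_gamma.
correlations = G.T @ g_gamma
print(int(np.argmax(correlations)))   # peaks at the source location
```

In a complicated channel the off-peak correlations look like noise, which is precisely what the next slide exploits.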

Complicated channel response

In a complicated channel, proximate locations might have totally uncorrelated responses. The slice G^T g_γ of G^T G has a main lobe at γ, and is random-looking away from γ.

[Figure: the m × m ambiguity function G^T G.]

Localization model

[Figure: example slice f_0 of G^T G (1D, synthetic).]

- Model this as a Gaussian main lobe plus side lobes.
- Find the max using a matched filter: with F = {shifts of a Gaussian}, we are solving arg min_{f ∈ F} ‖f − f_0‖_2².
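
The matched-filter step can be sketched directly (toy numbers, all names hypothetical): build a slice f_0 as a Gaussian main lobe plus random side lobes, then minimize ‖f − f_0‖² over shifted Gaussians, which is a correlation maximization since the shifts all have the same norm.

```python
import numpy as np

rng = np.random.default_rng(3)
n, sigma, true_shift = 512, 8.0, 301
t = np.arange(n)

def gauss(tau):
    # unit-height Gaussian bump of width sigma centered at tau
    return np.exp(-(t - tau) ** 2 / (2 * sigma ** 2))

# Model of the slice f0: Gaussian main lobe at the source location plus
# random-looking side lobes
f0 = gauss(true_shift) + 0.1 * rng.standard_normal(n)

# Matched filter: arg min over shifts of ||gauss(tau) - f0||^2; because
# ||gauss(tau)|| is constant in tau, this reduces to maximizing correlation
scores = np.array([np.dot(gauss(tau), f0) for tau in range(n)])
best_shift = int(np.argmax(scores))
print(best_shift)   # lands at (or within a couple of samples of) 301
```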

Multiple realizations

Framework: we receive a series of measurement vectors g_{γ_1}, g_{γ_2}, ..., g_{γ_L} for the same environment, where the γ_i are possibly different source locations.

- Calculating Gx or G^T y can be expensive: it requires the solution of a large PDE.
- A naive approach given observations g_{γ_i}: test by calculating g_τ = G 1_τ for all τ on a grid, re-using the calculations at each time step.
- Idea: we can use ideas from compressive sampling to significantly reduce the amount of computation required. (Think of every G 1_τ as an expensive measurement.)

Coded simulations

Pre-compute the responses to a dense set of randomly and simultaneously activated sources:

b_1 = G φ_1,  b_2 = G φ_2,  ...,  b_K = G φ_K

Given observations g_γ, correlate with the b_i and find the row of Φ which is closest to this set of correlations:

y = [ ⟨b_1, g_γ⟩ ; ⟨b_2, g_γ⟩ ; ... ; ⟨b_K, g_γ⟩ ] = Φ G^T g_γ = Φ f_0
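
The key identity ⟨b_i, g_γ⟩ = ⟨φ_i, G^T g_γ⟩ is just associativity, but it is the whole trick: K pre-computed coded simulations replace per-location applications of G. A sketch with a random matrix standing in for the expensive PDE solver (toy sizes, hypothetical names):

```python
import numpy as np

rng = np.random.default_rng(4)
n, m, K = 200, 150, 30    # grid size, response length, # coded simulations

G = rng.standard_normal((m, n))    # stand-in for the expensive PDE solver
Phi = rng.standard_normal((K, n))  # K random simultaneous source patterns

# Pre-compute: one coded "simulation" per row of Phi
B = G @ Phi.T                      # column i is b_i = G phi_i

# At localization time, correlating the observation g_gamma against the
# b_i yields Phi G^T g_gamma with no fresh applications of G or G^T
g_gamma = G[:, 17]                 # response from a source at gridpoint 17
y = B.T @ g_gamma

print(np.allclose(y, Phi @ (G.T @ g_gamma)))   # <b_i, g> = <phi_i, G^T g>
```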

General mathematical framework

Given a signal f_0 and a class of signals F, we want to find the closest point in F to f_0:

f̂_1 = arg min_{f ∈ F} ‖f_0 − f‖_2²

But we are only given y = Φ f_0. We solve instead

f̂_2 = arg min_{f ∈ F} ‖y − Φf‖_2² = arg min_{f ∈ F} ‖Φ(f_0 − f)‖_2²

If F = shifts of the same function, then this is the smashed filter (Davenport et al. '09).

Q: When are the solutions f̂_1 and f̂_2 close to one another?
A: When Φ preserves the distances between g_γ and all points in F.
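
A sketch comparing the two minimizers for F = {shifts of a Gaussian} (toy sizes, all names hypothetical): when the random sketch Φ preserves the distances ‖f_0 − f‖, the smashed-filter minimizer f̂_2 lands essentially where the direct minimizer f̂_1 does.

```python
import numpy as np

rng = np.random.default_rng(5)
n, sigma, K = 400, 6.0, 100
t = np.arange(n)

def gauss(tau):
    return np.exp(-(t - tau) ** 2 / (2 * sigma ** 2))

# f0: a shifted Gaussian plus mild noise; F = {shifts of a Gaussian}
f0 = gauss(123.0) + 0.05 * rng.standard_normal(n)

# Random sketch that, with high probability, preserves distances to F
Phi = rng.standard_normal((K, n)) / np.sqrt(K)

direct = [np.linalg.norm(f0 - gauss(tau)) for tau in range(n)]          # f_hat_1
smashed = [np.linalg.norm(Phi @ (f0 - gauss(tau))) for tau in range(n)]  # f_hat_2

print(int(np.argmin(direct)), int(np.argmin(smashed)))   # nearly the same shift
```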

Preserving distances

Set F_0 = F − g_γ.

Fact (BDDW '08): suppose that for any fixed f ∈ F_0 we have

P{ | ‖Φf‖_2² − ‖f‖_2² | > δ ‖f‖_2² } ≤ γ(δ).

Then

P{ sup_{f ∈ F_0} | ‖Φf‖_2² − ‖f‖_2² | > δ ‖f‖_2² } ≤ 2 N_{δ/4}(F_0) · γ(δ/2)

where γ(·) is the tail bound (concentration function) and N_δ(F_0) is the covering number for F_0. For subgaussian Φ we have γ(δ) ≤ e^{−C m δ²}, so we just need to estimate the covering numbers.

Net of Gaussians

For two Gaussians of width σ with shifts τ_1, τ_2, the distance obeys

‖f(t − τ_1) − f(t − τ_2)‖_2 ≤ ε  whenever  |τ_1 − τ_2| ≤ 2σε.

This gives us an easy estimate for the size of the best ε-net:

N_ε(F_0) ≤ T / (8σε)

where T is the length of the interval you are searching over.

How many tests to guarantee accuracy?

Theorem: the functions ‖f − g_γ‖_2 and ‖Φ(f − g_γ)‖_2 are within δ of each other (with probability p), uniformly over F, when

K ≥ (Const / δ²) ( log(T/σ) + log(1/p) ) + C.

For complicated channels we just need δ ≈ Const.

Demo

Grid resolution n ≈ 3000, number of coded experiments K ≈ 300.

[Figure: left, the ambiguity function (G^T g_γ)(λ); right, the coded localization (Φ^T Φ G^T g_γ)(λ) with independent codes.]

Summary

- Two problems: forward modeling for seismic imaging, and source localization in complicated channels.
- Both have similar computational bottlenecks: the simulations require solving large PDEs.
- Randomly coding the inputs allows us to push through these computational bottlenecks more efficiently.
- This is analogous to expensive acquisitions made easier with compressive sensing.