Multichannel Deconvolution of Layered Media Using MCMC methods

Multichannel Deconvolution of Layered Media Using MCMC Methods. Idan Ram, Electrical Engineering Department, Technion - Israel Institute of Technology. Supervisors: Prof. Israel Cohen and Prof. Shalom Raz

OUTLINE. 1. Introduction 2. Blind Seismic Deconvolution Using MCMC Methods 3. Multichannel Seismic Deconvolution 4. Blind Multichannel MCMC Deconvolution 5. Deconvolution By Smoothing 6. Conclusions


Introduction: An acoustic wave is transmitted into the ground by a source, and the reflected energy resulting from impedance changes is measured by geophones. The observed seismic trace can be modeled as a noisy convolution between a 2D reflectivity and an unknown wavelet. Deconvolution is used to remove the effect of the wavelet.

Problem Formulation: The seismic trace Y is modeled as the convolution Y = h * R + W, where h is an unknown 1D seismic wavelet, invariant in both horizontal and vertical directions; R is a 2D reflectivity section, consisting of continuous, smooth, and mostly horizontal layer boundaries; and W is white Gaussian noise, independent of R, with zero mean and variance σ_w². The blind deconvolution problem consists in recovering the unknown seismic wavelet h and the 2D reflectivity section R from the observed seismic trace Y. (Figures: seismic wavelet and reflectivity section.)
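To make the convolution model concrete, here is a minimal sketch of the forward model in Python; the array names, the column-wise "same" convolution, and the noise handling are illustrative assumptions rather than the exact setup used in the talk.

```python
import numpy as np

def synthesize_traces(R, h, sigma_w, rng=None):
    """Noisy convolution model Y = h * R + W, applied column by column.

    R       : (N, J) 2D reflectivity section (one column per trace)
    h       : (L,) 1D seismic wavelet, assumed invariant over the section
    sigma_w : standard deviation of the white Gaussian noise W
    """
    rng = np.random.default_rng() if rng is None else rng
    N, J = R.shape
    Y = np.empty((N, J))
    for j in range(J):
        # each observed trace is the reflectivity column filtered by the wavelet
        Y[:, j] = np.convolve(R[:, j], h, mode="same")
    return Y + sigma_w * rng.standard_normal((N, J))
```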

Goals. Previous works: (1) Blind Marine Seismic Deconvolution Using Statistical MCMC Methods, by O. Rosec, J.-M. Boucher, B. Nsiri, and T. Chonavel; (2) Multichannel seismic deconvolution, by J. Idier and Y. Goussard. Goals: 1. Combine the two methods above into a blind stochastic multichannel deconvolution scheme. 2. Create a smoothing version of the proposed multichannel scheme.

OUTLINE. 1. Introduction 2. Blind Seismic Deconvolution Using MCMC Methods 3. Multichannel Seismic Deconvolution 4. Blind Multichannel MCMC Deconvolution 5. Deconvolution By Smoothing 6. Conclusions

Single Channel Deconvolution: The multichannel deconvolution problem can be broken into independent 1D vertical deconvolution problems. Single channel blind deconvolution consists in recovering the 1D reflectivity sequence r and the wavelet h from a 1D observed trace y. In the vertical direction, a 1D reflectivity signal appears as a sparse spike train. The reflectivity sequence can be modeled as a Bernoulli-Gaussian (BG) process: p(q(k) = 1) = λ, p(r(k)) = λ N(0, σ_r²) + (1 - λ) δ(r(k)). (Figures: the location sequence q(k), the 1D reflectivity r(k), and the 1D trace.)
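As a quick illustration of the Bernoulli-Gaussian prior, the sketch below draws a sparse 1D reflectivity sequence; the values of λ and σ_r are arbitrary and only for demonstration.

```python
import numpy as np

def sample_bg_reflectivity(N, lam, sigma_r, rng=None):
    """Draw a Bernoulli-Gaussian spike train:
    q(k) ~ Bernoulli(lam), r(k) ~ N(0, sigma_r**2) where q(k)=1, else 0."""
    rng = np.random.default_rng() if rng is None else rng
    q = rng.random(N) < lam                              # reflector locations
    r = np.where(q, sigma_r * rng.standard_normal(N), 0.0)
    return r, q.astype(int)

# example: a 500-sample trace with roughly 5% reflectors
r, q = sample_bg_reflectivity(500, lam=0.05, sigma_r=1.0)
```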

ML Parameter Estimation: The parameters θ = (h, λ, σ_r², σ_w²) need to be estimated. This is carried out using ML estimation: θ̂_ML = argmax_θ ln p(y | θ). This is an incomplete-data problem; the complete-data criterion is θ̂_ML = argmax_θ ln p(r, q, y | θ). This maximization problem is solved using the stochastic expectation maximization (SEM) algorithm.

The Gibbs Sampler: Suppose we wish to sample a random vector x = (x_1, ..., x_n) according to f(x). Gibbs sampler algorithm: A. For a given x_t, generate y = (y_1, ..., y_n) as follows: 1) Draw y_1 from the conditional pdf f(x_1 | x_{t,2}, ..., x_{t,n}). 2) Draw y_i from the conditional pdf f(x_i | y_1, ..., y_{i-1}, x_{t,i+1}, ..., x_{t,n}). 3) Draw y_n from the conditional pdf f(x_n | y_1, ..., y_{n-1}). B. Let x_{t+1} = y. Under mild conditions, the limiting distribution of the process {x_t, t = 1, 2, ...} is precisely f(x).
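A minimal, self-contained illustration of the general scheme above: a Gibbs sampler for a correlated bivariate Gaussian, where each full conditional is itself Gaussian. This is a standard textbook example, not part of the deconvolution method.

```python
import numpy as np

def gibbs_bivariate_gaussian(rho, n_iter=5000, rng=None):
    """Sample (x1, x2) ~ N(0, [[1, rho], [rho, 1]]) by alternating draws
    from the full conditionals x1 | x2 ~ N(rho*x2, 1 - rho**2) and vice versa."""
    rng = np.random.default_rng() if rng is None else rng
    samples = np.zeros((n_iter, 2))
    x1 = x2 = 0.0
    for t in range(n_iter):
        x1 = rho * x2 + np.sqrt(1.0 - rho**2) * rng.standard_normal()
        x2 = rho * x1 + np.sqrt(1.0 - rho**2) * rng.standard_normal()
        samples[t] = x1, x2
    return samples  # after a burn-in period, approximately distributed as f(x)
```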

The Gibbs Sampler (cont.): Simulates observations of q and r from p(r, q | y). The algorithm samples from p(r(k), q(k) | y, q_{-k}, r_{-k}) ~ Bi(λ_k) N(m_1, V_1). Gibbs sampler algorithm: 1. Initialization: choose q^(0) and r^(0). 2. For 1 ≤ i ≤ I and for k = 1, ..., N_r: Detection step: compute λ_k = p(q(k) = 1 | y, q_{-k}, r_{-k}) and simulate q^(i)(k) ~ Bi(λ_k). Estimation step: simulate r^(i)(k) ~ N(m_1, V_1) if q^(i)(k) = 1; otherwise r^(i)(k) = 0. (Illustration: example sample sequences (q^(1), r^(1)) and (q^(2), r^(2)).)
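The sketch below shows one possible implementation of a single Gibbs sweep for the Bernoulli-Gaussian model, writing the trace as y = H r + w with H the convolution matrix. The conjugate conditionals (the spike probability λ_k and the Gaussian N(m_1, V_1)) follow from standard linear-Gaussian algebra; the matrix form and variable names are my own, not the author's code.

```python
import numpy as np

def bg_gibbs_sweep(y, H, r, q, lam, sigma_r2, sigma_w2, rng):
    """One Gibbs sweep over all samples of a Bernoulli-Gaussian reflectivity.

    y : (M,) observed trace; H : (M, N) convolution matrix (columns = shifted wavelet)
    r, q : current reflectivity amplitudes and spike indicators (updated in place)
    """
    for k in range(len(r)):
        h_k = H[:, k]
        e = y - H @ r + r[k] * h_k          # residual with sample k removed
        hh = h_k @ h_k
        he = h_k @ e
        # detection step: posterior spike probability, with r(k) marginalized out
        log_ratio = (-0.5 * np.log1p(sigma_r2 * hh / sigma_w2)
                     + 0.5 * sigma_r2 * he**2 / (sigma_w2 * (sigma_w2 + sigma_r2 * hh)))
        lam_k = lam / (lam + (1.0 - lam) * np.exp(-log_ratio))
        q[k] = rng.random() < lam_k
        # estimation step: draw the amplitude from its Gaussian conditional
        if q[k]:
            V1 = sigma_r2 * sigma_w2 / (sigma_w2 + sigma_r2 * hh)
            m1 = V1 * he / sigma_w2
            r[k] = m1 + np.sqrt(V1) * rng.standard_normal()
        else:
            r[k] = 0.0
    return r, q
```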

The SEM Algorithm: The SEM algorithm follows the steps: 1. Initialization: choose r^(0), q^(0), θ^(0). 2. For i = 1, ..., I: E step: simulate r^(i), q^(i) with the Gibbs sampler according to p(r, q | y, θ^(i-1)). M step: parameter estimation, θ^(i) = argmax_θ p(r^(i), q^(i), y | θ). 3. θ̂ = (1/(I - I₀)) Σ_{i=I₀+1}^{I} θ^(i).
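A schematic SEM outer loop, reusing the bg_gibbs_sweep sketch above as the E step. The closed-form M-step updates shown here (empirical spike rate, empirical amplitude and noise variances) are a plausible reconstruction of a typical M step for this model, not necessarily the exact updates of the talk; the wavelet update is only indicated by a comment.

```python
import numpy as np

def sem(y, H, n_iter=50, burn_in=25, rng=None):
    """Stochastic EM skeleton: alternate a Gibbs draw of (r, q) with
    parameter re-estimation, then average the post-burn-in parameters."""
    rng = np.random.default_rng() if rng is None else rng
    N = H.shape[1]
    r = np.zeros(N)
    q = np.zeros(N, dtype=bool)
    lam, sigma_r2, sigma_w2 = 0.1, 1.0, 1.0     # crude initial guesses
    history = []
    for i in range(n_iter):
        # E step: simulate (r, q) given the current parameters
        r, q = bg_gibbs_sweep(y, H, r, q, lam, sigma_r2, sigma_w2, rng)
        # M step: re-estimate the parameters from the completed data
        lam = max(q.mean(), 1e-3)
        if q.any():
            sigma_r2 = float(np.mean(r[q] ** 2))
        sigma_w2 = float(np.mean((y - H @ r) ** 2))
        # (a wavelet update, e.g. least squares of y on shifts of r, would go here)
        if i >= burn_in:
            history.append((lam, sigma_r2, sigma_w2))
    return np.mean(history, axis=0)             # averaged parameter estimates
```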

The Deconvolution Process: MAP estimation: (r̂, q̂) = argmax_{r,q} p(r, q | y). This maximization problem can be solved in two steps: Detection: q̂ = argmax_q p(q | y). Estimation: r̂ = argmax_r p(r | y, q̂). The detection problem is hard because q has 2^{N_r} discrete configurations. A simpler criterion, called maximum posterior mode (MPM), which maximizes p(r(k), q(k) | y), is used instead.

The MPM Algorithm: The MPM algorithm follows the steps: 1. For i = 1, ..., I, simulate (r^(i), q^(i)) using the Gibbs sampler. 2. For k = 1, ..., N_r: detection step: q̂(k) = 1 if (1/(I - I₀)) Σ_{i=I₀+1}^{I} q^(i)(k) > 0.5, and 0 otherwise; estimation step: r̂(k) = Σ_{i=I₀+1}^{I} q^(i)(k) r^(i)(k) / Σ_{i=I₀+1}^{I} q^(i)(k) if q̂(k) = 1, and 0 otherwise.
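A sketch of the MPM averaging, assuming the Gibbs samples have been stored as arrays Q and R of shape (I, N_r); the first burn_in rows are discarded.

```python
import numpy as np

def mpm_estimate(Q, R, burn_in):
    """MPM detection and estimation from stored Gibbs samples.

    Q, R : (I, N) arrays holding q^(i) and r^(i) for every iteration i.
    """
    Qp, Rp = Q[burn_in:], R[burn_in:]
    q_hat = Qp.mean(axis=0) > 0.5                  # detection: posterior mode of q(k)
    counts = np.maximum(Qp.sum(axis=0), 1)
    # estimation: average r(k) over the iterations in which a spike was present
    r_hat = np.where(q_hat, (Qp * Rp).sum(axis=0) / counts, 0.0)
    return r_hat, q_hat.astype(int)
```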

The MPM Algorithm (cont.): (Illustration: five Gibbs samples q^(1), ..., q^(5) and r^(1), ..., r^(5), and the resulting MPM estimates q̂ and r̂.)

Reflectivity Post-Processing: Fuse and replace by their gravity center: 1. two successive impulses; 2. two impulses separated by one sample. A sketch of this rule is given below. (Illustration: (r̂, q̂) before and after fusion.)
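A possible implementation of this post-processing rule: detected spikes that are adjacent or separated by a single sample are merged into one spike placed at their gravity center. The magnitude weighting and the summed amplitude of the fused spike are my assumptions.

```python
import numpy as np

def fuse_close_reflectors(r_hat):
    """Merge pairs of spikes at most two samples apart into a single spike
    located at their magnitude-weighted gravity center."""
    r = r_hat.copy()
    idx = np.flatnonzero(r)
    for a, b in zip(idx[:-1], idx[1:]):
        if b - a <= 2 and r[a] != 0.0 and r[b] != 0.0:
            wa, wb = abs(r[a]), abs(r[b])
            center = int(round((a * wa + b * wb) / (wa + wb)))
            amp = r[a] + r[b]                 # assumed: fused amplitude is the sum
            r[a] = r[b] = 0.0
            r[center] = amp
    return r
```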

Advantages and Limitations: Advantages: 1. Estimates the wavelet and the BG model's missing parameters. 2. Produces good estimates for single-channel traces. 3. Uses stochastic methods, which are less likely to converge to local maxima. Limitations: 1. Does not account for the medium's stratified structure in the deconvolution process.

OUTLINE. 1. Introduction 2. Blind Seismic Deconvolution Using MCMC Methods 3. Multichannel Seismic Deconvolution 4. Blind Multichannel MCMC Deconvolution 5. Deconvolution By Smoothing 6. Conclusions

MBG-I Model: An exact 2D extension of the BG representation. Comprised of: 1. An MBRF p(t, q) that models the geometric properties of the reflectivity using location variables q_{k,j} and transition variables t^/_{k,j}, t^-_{k,j}, t^\_{k,j}. 2. p(R | T, Q), a white Gaussian reflectivity amplitude model, defined conditionally on the MBRF. (Illustration: boundary representation with location and transition variables.)

Markov-Bernoulli Random Field Characteristics: MBRF characteristics: 1. p(t^/_{k,j}, t^-_{k,j}, t^\_{k,j}) = p(t^/_{k,j}) p(t^-_{k,j}) p(t^\_{k,j}). 2. q_{k,j} ~ Bi(λ), t^/_{k,j} ~ Bi(μ^/), t^-_{k,j} ~ Bi(μ^-), t^\_{k,j} ~ Bi(μ^\). 3. λ = 1 - (1 - μ^/)(1 - μ^-)(1 - μ^\)(1 - ε). 4. p(t^/_{k,j} = 0, t^-_{k,j} = 0, t^\_{k,j} = 0 | q_{k,j} = 0) = 1. 5. p(q_{k,j+1} = 1 | t^/_{k,j} = 0, t^-_{k,j} = 0, t^\_{k,j} = 0) = ε. Amplitude field characteristics: 1. r_{k,j} = 0 if q_{k,j} = 0. 2. r_{k,j} ~ N(0, σ_r²). 3. r_{k+d,j+1} = a r_{k,j} + w_k, w_k ~ N(0, (1 - a²) σ_r²), where d is the vertical shift of the boundary.

Deconvolution Scheme: MAP estimation: (R̂, Q̂, T̂^/, T̂^-, T̂^\) = argmax p(R, Q, T^/, T^-, T^\ | Y). Computing the exact MAP solution is practically impossible. The following suboptimal recursive maximization procedure is used instead: 1. First column: (r̂_1, q̂_1) = argmax_{r_1, q_1} p(r_1, q_1, y_1). 2. Columns 2 ≤ j ≤ J: (r̂_j, q̂_j, t̂^/_j, t̂^-_j, t̂^\_j) = argmax p(r_j, q_j, t^/_j, t^-_j, t^\_j, y_j | r̂_{j-1}, q̂_{j-1}). Each partial criterion is maximized using a suboptimal SMLR-type algorithm.

Deconvolution Scheme (cont.): (Illustration: recursive column-by-column estimation, from q̂_1, r̂_1 and t̂^/_1, t̂^-_1, t̂^\_1 to q̂_2, r̂_2, then t̂^/_2, t̂^-_2, t̂^\_2 and q̂_3, r̂_3.)

Advantages and Limitations: Advantages: 1. Produces good estimates of the 2D reflectivity. Limitations: 1. Non-blind. 2. Each partial criterion is maximized only with respect to r_j, q_j, t^/_j, t^-_j, t^\_j. 3. r_j is determined based on observations only up to y_j. 4. The SMLR-type algorithm may converge to a local optimum. 5. The first reflectivity column is assumed to be known.

OUTLINE. 1. Introduction 2. Blind Seismic Deconvolution Using MCMC Methods 3. Multichannel Seismic Deconvolution 4. Blind Multichannel MCMC Deconvolution 5. Deconvolution By Smoothing 6. Conclusions

Blind Multichannel MCMC Deconvolution: Uses the MBG-I reflectivity model. Uses the following suboptimal recursive maximization procedure: 1. First column: (r̂_1, q̂_1) = argmax_{r_1, q_1} p(r_1, q_1 | y_1). 2. Columns 2 ≤ j ≤ J: (r̂_j, q̂_j, t̂^/_j, t̂^-_j, t̂^\_j) = argmax p(r_j, q_j, t^/_j, t^-_j, t^\_j | y_j, r̂_{j-1}, q̂_{j-1}). The SMLR-type algorithm is replaced by an extended version of the MPM algorithm, which maximizes p(t^/_{k,j}, t^-_{k,j}, t^\_{k,j}, q_{k,j}, r_{k,j} | y_j, q̂_{j-1}, r̂_{j-1}).

MBG-I Parameter Estimation: The parameters θ = (h, λ, σ_r², σ_w²) are estimated using the SEM algorithm. The following method is used to estimate θ_MBG = (a, μ^/, μ^-, μ^\, ε): 1. Apply single-channel deconvolution to each of Y's columns. 2. Remove all the isolated reflectors from the obtained reflectivity section. 3. Calculate: μ^/ = (# upward boundary transitions) / (# samples in T^/); μ^- = (# horizontal boundary transitions) / (# samples in T^-); μ^\ = (# downward boundary transitions) / (# samples in T^\); a = average attenuation ratio between neighboring reflectors. A counting sketch for step 3 is given below.
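A counting sketch for step 3 above, operating on a binary reflector map Q (N_r x J) obtained after steps 1 and 2. Which diagonal corresponds to μ^/ versus μ^\ depends on the depth-axis convention, and the normalization by the number of reflectors is an assumption, so treat the details as illustrative.

```python
import numpy as np

def transition_rates(Q):
    """Fraction of reflectors in column j that continue in column j+1
    one sample shallower (up), at the same depth, or one sample deeper (down)."""
    left, right = Q[:, :-1], Q[:, 1:]
    n_ref = max(int(left.sum()), 1)
    mu_up    = float((left[1:, :]  * right[:-1, :]).sum()) / n_ref
    mu_horiz = float((left         * right).sum())         / n_ref
    mu_down  = float((left[:-1, :] * right[1:, :]).sum())  / n_ref
    return mu_up, mu_horiz, mu_down
```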

Multichannel Deconvolution's Gibbs Sampler: Used by the extended MPM algorithm. Simulates observations of r_j, q_j, t^/_j, t^-_j, t^\_j from p(r_j, q_j, t^/_j, t^-_j, t^\_j | y_j, r̂_{j-1}, q̂_{j-1}). The algorithm samples from: 1. p(r_{k,j}, q_{k,j} | y_j, r̂_{j-1}, q̂_{j-1}, r_{-k,j}, q_{-k,j}, t^/_j, t^-_j, t^\_j) ~ Bi(λ^b_{k,j}) N(m_b, V_b). 2. p(t^/_{k,j} | t^-_j, t^\_j, q_j, r_j, r̂_{j-1}, q̂_{j-1}, y_j) ~ Bi(μ^/_{k,j}). 3. p(t^-_{k,j} | t^/_j, t^\_j, q_j, r_j, r̂_{j-1}, q̂_{j-1}, y_j) ~ Bi(μ^-_{k,j}). 4. p(t^\_{k,j} | t^/_j, t^-_j, q_j, r_j, r̂_{j-1}, q̂_{j-1}, y_j) ~ Bi(μ^\_{k,j}).

Multichannel Deconvolution's Gibbs Sampler Algorithm: 1. Initialization: choose r^(0), q^(0), t^/(0), t^-(0), t^\(0). 2. For 1 ≤ i ≤ I and for k = 1, ..., N_r: Detection step: compute λ^b_{k,j}, μ^/_{k,j}, μ^-_{k,j}, μ^\_{k,j}; simulate t^/(i)_{k,j} ~ Bi(μ^/_{k,j}), t^-(i)_{k,j} ~ Bi(μ^-_{k,j}), t^\(i)_{k,j} ~ Bi(μ^\_{k,j}), q^(i)_{k,j} ~ Bi(λ^b_{k,j}). Estimation step: simulate r^(i)_{k,j} ~ N(m_b, V_b) if q^(i)_{k,j} = 1, and set r^(i)_{k,j} = 0 if q^(i)_{k,j} = 0. (Illustration: location, amplitude, and transition variables of columns j-1 and j.)

Multichannel Deconvolution's MPM Algorithm: The extended MPM algorithm follows the steps: 1. For i = 1, ..., I, simulate r^(i)_j, q^(i)_j, t^/(i)_j, t^-(i)_j, t^\(i)_j using the Gibbs sampler. 2. For k = 1, ..., N_r: detection step: t̂^/_{k,j} = 1 if (1/(I - I₀)) Σ_{i=I₀+1}^{I} t^/(i)_{k,j} > 0.5, and 0 otherwise; likewise for t̂^-_{k,j} and t̂^\_{k,j}; q̂_{k,j} = 1 if (1/(I - I₀)) Σ_{i=I₀+1}^{I} q^(i)_{k,j} > 0.5, and 0 otherwise.

Multichannel Deconvolution's MPM Algorithm (cont.): estimation step: r̂_{k,j} = Σ_{i=I₀+1}^{I} q^(i)_{k,j} r^(i)_{k,j} / Σ_{i=I₀+1}^{I} q^(i)_{k,j} if q̂_{k,j} = 1, and 0 otherwise. (Illustration: transition-variable samples t^/(1), ..., t^/(5), reflectivity samples r^(1), ..., r^(5), and the resulting estimates t̂^/ and r̂.)

Synthetic Data Results: (Figures: reflectivity section, seismic wavelet, and seismic traces at SNR = 5 dB and SNR = 10 dB.)

Synthetic Results, Estimated Parameters: (Figures: real vs. estimated wavelet at SNR = 5 dB and SNR = 10 dB.) Estimated model parameters (λ, σ_r/σ_w, a, μ^/, μ^-, μ^\, ε): True: .55, .32, .999, .66, .399, .8, .2. Estimated (5 dB): .533, .223, .94, .833, .8, .222, .96, .46. Estimated (10 dB): .54, .925, .657, .877, .82, .38, .9, .69.

Synthetic Data Results, SNR = 5 dB: (Figures: single-channel deconvolution vs. multichannel deconvolution at SNR = 5 dB.)

Synthetic Data Results, SNR = 10 dB: (Figures: single-channel deconvolution vs. multichannel deconvolution at SNR = 10 dB.)

Synthetic Results, Performance Quality Measures: The following loss functions were used to quantify the performance of the algorithms: 1. L_{miss+false} = ||r̂ - r||² + N_miss + N_false. 2. L_miss = ||r̂ - r||² + N_miss. 3. L_false = ||r̂ - r||² + N_false. 4. L_SSQ = ||r̂ - r||². 5. L²_{miss+false}, L²_miss, L²_false give partial credit to reflectors that are close to their true positions. A sketch of how such measures can be computed follows the table.
SNR (dB) | Method | L_miss+false | L_miss | L_false | L_SSQ | L²_miss+false | L²_miss | L²_false
5 | SC | 52.3 | 44.3 | 365.3 | 276. | 293.7 | 235.2 | 86.2
5 | MC | 372.6 | 3.6 | 265.6 | 72.9 | 26. | 73.6 | 38.6
10 | SC | 243.9 | 92.9 | 75.9 | 6.8 | 4.5 | 9.5 | 92.5
10 | MC | 96.3 | 49.3 | 45.3 | 75.4 | 23.6 | 92.6 | 89.6
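A sketch of how the detection-error counts behind these measures could be computed. The exact definitions used in the talk (tolerance window, matching rule, squaring) are not fully recoverable from the slide, so the functions below are only an assumed reading of the formulas above.

```python
import numpy as np

def detection_errors(q_hat, q_true, tol=0):
    """Count missed and false reflectors; with tol > 0 an estimated spike
    within tol samples of an unmatched true spike counts as correct
    (the 'partial credit' idea behind the L2-type measures)."""
    true_idx = list(np.flatnonzero(q_true))
    n_false = 0
    for k in np.flatnonzero(q_hat):
        near = [t for t in true_idx if abs(t - k) <= tol]
        if near:
            true_idx.remove(near[0])          # match one true spike and remove it
        else:
            n_false += 1
    n_miss = len(true_idx)
    return n_miss, n_false

def loss_miss_false(r_hat, r_true, q_hat, q_true, tol=0):
    """Assumed form of L_{miss+false}: squared amplitude error plus error counts."""
    n_miss, n_false = detection_errors(q_hat, q_true, tol)
    return float(np.sum((r_hat - r_true) ** 2)) + n_miss + n_false
```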

Real Data, Estimated Parameters: (Figure: estimated wavelet.) Estimated model parameters (λ, σ_r/σ_w, a, μ^/, μ^-, μ^\, ε): .382, 3.9488, .724, .8989, .5, .2, .8, .252.

Real Data Results: (Figures: real data, single-channel deconvolution, and multichannel deconvolution.)

OUTLINE. 1. Introduction 2. Blind Seismic Deconvolution Using MCMC Methods 3. Multichannel Seismic Deconvolution 4. Blind Multichannel MCMC Deconvolution 5. Deconvolution By Smoothing 6. Conclusions

Deconvolution By Smoothing: Uses the MBG-I reflectivity prior model. Uses the same parameter estimation method as the first proposed algorithm. Uses a column-recursive deconvolution scheme, similar to that of the previous algorithm. Accounts for observation columns subsequent to y_j when estimating r_j.

Smoothing Scheme: (Illustration of the smoothing scheme over the reflectivity columns.)

Smoothing Windows: (Illustration of the smoothing windows and the location estimates q̂ they involve.)

Smoothing Scheme (cont.): (Illustration: the location, amplitude, and transition variables of columns j-1, j, and j+1 involved in each smoothing step.)

Deconvolution Scheme: Uses the following estimation procedure: First column: (r̂_1, q̂_1) = argmax p(r_1, q_1, t^/_1, t^-_1, t^\_1 | y_1, y_2). Middle columns 1 < j < J: (r̂_j, q̂_j, t̂^/_j, t̂^-_j, t̂^\_j) = argmax p(r_j, q_j, t^/_j, t^-_j, t^\_j | y_j, y_{j+1}, q̂_{j-1}, r̂_{j-1}). Last column: (r̂_J, q̂_J, t̂^/_J, t̂^-_J, t̂^\_J) = argmax p(r_J, q_J, t^/_J, t^-_J, t^\_J | y_J, q̂_{J-1}, r̂_{J-1}). Uses a further extended version of the MPM algorithm, which maximizes p(r_{k,j}, q_{k,j}, t^/_{k,j}, t^-_{k,j}, t^\_{k,j} | y_j, y_{j+1}, q̂_{j-1}, r̂_{j-1}).

Smoothing's Gibbs Sampler: Used by the smoothing's MPM algorithm. Simulates observations of r_j, q_j, t^/_j, t^-_j, t^\_j from p(r_j, q_j, t^/_j, t^-_j, t^\_j | y_j, y_{j+1}, r̂_{j-1}, q̂_{j-1}). The algorithm samples from p(r_{k,j}, q_{k,j} | y_j, y_{j+1}, r_{-k,j}, q_{-k,j}, r̂_{j-1}, q̂_{j-1}, r_{j+1}, q_{j+1}, t_j, t_{j+1}) ~ Bi(λ^m_{k,j}) N(m_m, V_m), together with the multichannel Gibbs sampler's conditional distributions.

Smoothing's Gibbs Sampler Algorithm: 1. Initialization: choose r^(0), q^(0), t^/(0), t^-(0), t^\(0). 2. For 1 ≤ i ≤ I: 2.1. For k = 1, ..., N_r (current column): detection step: compute λ^m_{k,j}, μ^/_{k,j}, μ^-_{k,j}, μ^\_{k,j}; simulate t^/(i)_{k,j} ~ Bi(μ^/_{k,j}), t^-(i)_{k,j} ~ Bi(μ^-_{k,j}), t^\(i)_{k,j} ~ Bi(μ^\_{k,j}), q^(i)_{k,j} ~ Bi(λ^m_{k,j}); estimation step: simulate r^(i)_{k,j} ~ N(m_b, V_b) if q^(i)_{k,j} = 1, and set r^(i)_{k,j} = 0 if q^(i)_{k,j} = 0. 2.2. For k = N_r + 1, ..., 2N_r (next column): follow the multichannel Gibbs sampler procedure. (Illustration: the variables of columns j-1, j, and j+1.)

Smoothing's MPM Algorithm: The smoothing's MPM algorithm follows the steps: 1. For i = 1, ..., I, simulate r^(i)_j, q^(i)_j, t^/(i)_j, t^-(i)_j, t^\(i)_j using the Gibbs sampler. 2. For k = 1, ..., N_r: detection step: t̂^/_{k,j} = 1 if (1/(I - I₀)) Σ_{i=I₀+1}^{I} t^/(i)_{k,j} > 0.5, and 0 otherwise; likewise for t̂^-_{k,j} and t̂^\_{k,j}; q̂_{k,j} = 1 if (1/(I - I₀)) Σ_{i=I₀+1}^{I} q^(i)_{k,j} > 0.5, and 0 otherwise.

Smoothing's MPM Algorithm (cont.): estimation step: r̂_{k,j} = Σ_{i=I₀+1}^{I} q^(i)_{k,j} r^(i)_{k,j} / Σ_{i=I₀+1}^{I} q^(i)_{k,j} if q̂_{k,j} = 1, and 0 otherwise.

Smoothing's MPM Algorithm (cont.): (Illustration: Gibbs samples q^(1), ..., q^(5) and r^(1), ..., r^(5), and the resulting estimates q̂ and r̂.)

Synthetic Data Results, SNR = 5 dB: (Figures: multichannel deconvolution vs. deconvolution by smoothing at SNR = 5 dB.)

Synthetic Data Results, SNR = 10 dB: (Figures: multichannel deconvolution vs. deconvolution by smoothing at SNR = 10 dB.)

Synthetic Results, Performance Quality Measures:
SNR (dB) | Method | L_miss+false | L_miss | L_false | L_SSQ | L²_miss+false | L²_miss | L²_false
5 | SC | 52.3 | 44.3 | 365.3 | 276. | 293.7 | 235.2 | 86.2
5 | MC | 372.6 | 3.6 | 265.6 | 72.9 | 26. | 73.6 | 38.6
5 | Smoothing | 35.4 | 252.4 | 227.4 | 36.4 | 87.7 | 5.2 | 25.2
10 | SC | 243.9 | 92.9 | 75.9 | 6.8 | 4.5 | 9.5 | 92.5
10 | MC | 96.3 | 49.3 | 45.3 | 75.4 | 23.6 | 92.6 | 89.6
10 | Smoothing | 53.8 | 4.8 | 4.8 | 4. | 9.5 | 8.5 | 8.5

Real Data Results: (Figures: single-channel deconvolution, multichannel deconvolution, and deconvolution by smoothing.)

OUTLINE. 1. Introduction 2. Blind Seismic Deconvolution Using MCMC Methods 3. Multichannel Seismic Deconvolution 4. Blind Multichannel MCMC Deconvolution 5. Deconvolution By Smoothing 6. Conclusions

Conclusions: We presented two stochastic blind multichannel deconvolution algorithms. The proposed parameter estimation method successfully recovers the MBG-I model's parameters. Both proposed algorithms produce better deconvolution results than the single-channel blind deconvolution method. The smoothing algorithm produces better deconvolution results than the proposed multichannel algorithm. The performance of both blind deconvolution schemes improves as the SNR increases.

Future Research: Replacing the MBG-I model with the MBG-II model. Analyzing the smoothing algorithm's performance for larger smoothing windows. Using wavelet estimation methods that can estimate both the wavelet's length and its maximum position.