Multichannel Deconvolution of Layered Media Using MCMC Methods
1 Multichannel Deconvolution of Layered Media Using MCMC Methods. Idan Ram, Electrical Engineering Department, Technion - Israel Institute of Technology. Supervisors: Prof. Israel Cohen and Prof. Shalom Raz
2 OUTLINE
1. Introduction
2. Blind Seismic Deconvolution Using MCMC Methods
3. Multichannel Seismic Deconvolution
4. Blind Multichannel MCMC Deconvolution
5. Deconvolution by Smoothing
6. Conclusions
4 Introduction. An acoustic wave is transmitted into the ground from a source. The reflected energy resulting from impedance changes is measured by geophones. The observed seismic trace can be modeled as a noisy convolution between a 2D reflectivity and an unknown wavelet. Deconvolution is used to remove the effect of the wavelet.
5 Problem Formulation. The seismic trace Y is modeled as the convolution Y = h*R + W, where:
h - unknown 1D seismic wavelet, invariant in both horizontal and vertical directions.
R - 2D reflectivity section, consisting of continuous, smooth, and mostly horizontal layer boundaries.
W - white Gaussian noise, independent of R, with zero mean and variance σ_w².
The blind deconvolution problem consists in recovering the unknown seismic wavelet h and the 2D reflectivity section R from the observed seismic trace Y.
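The trace model Y = h*R + W above can be sketched in a few lines of NumPy; the function and parameter names here are illustrative, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def synthesize_trace(R, h, sigma_w):
    """Convolve each column of the 2D reflectivity R with the 1D wavelet h
    (the wavelet acts only in the vertical direction) and add white
    Gaussian noise of standard deviation sigma_w."""
    Y = np.apply_along_axis(lambda col: np.convolve(col, h, mode="same"), 0, R)
    return Y + sigma_w * rng.normal(size=Y.shape)

# Toy example: one horizontal layer boundary at depth sample 20.
R = np.zeros((64, 8))
R[20, :] = 1.0
h = np.array([0.2, 1.0, 0.2])  # short symmetric wavelet (illustrative)
Y = synthesize_trace(R, h, sigma_w=0.01)
```

Because the wavelet is assumed invariant in both directions, a single 1D kernel is applied identically to every column.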
6 Goals. Previous works:
1. Blind marine seismic deconvolution using statistical MCMC methods - O. Rosec, J. M. Boucher, B. Nsiri, and T. Chonavel.
2. Multichannel seismic deconvolution - J. Idier and Y. Goussard.
Goals:
1. Combine the two methods above into a blind stochastic multichannel deconvolution scheme.
2. Create a smoothing version of the proposed multichannel scheme.
8 Single Channel Deconvolution. The multichannel deconvolution problem can be broken into independent 1D vertical deconvolution problems. Single channel blind deconvolution consists in recovering the 1D reflectivity sequence r and the wavelet h from a 1D observed trace y. In the vertical direction, a 1D reflectivity signal appears as a sparse spike train. The reflectivity sequence can be modeled as a Bernoulli-Gaussian (BG) process:
p(q(k) = 1) = λ,  p(r(k)) = λ N(0, σ_r²) + (1 - λ) δ(r(k))
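Sampling from the Bernoulli-Gaussian prior above is straightforward; this sketch (names are illustrative) draws the indicator sequence q and the sparse spike train r.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_bg(n, lam, sigma_r):
    """Draw a Bernoulli-Gaussian reflectivity of length n:
    q(k) ~ Bernoulli(lam); r(k) ~ N(0, sigma_r^2) where q(k)=1, else 0,
    i.e. p(r(k)) = lam*N(0, sigma_r^2) + (1-lam)*delta(r(k))."""
    q = (rng.random(n) < lam).astype(int)
    r = np.where(q == 1, rng.normal(0.0, sigma_r, size=n), 0.0)
    return q, r

q, r = sample_bg(1000, lam=0.05, sigma_r=1.0)
```

With a small λ (here 0.05), only a few percent of the samples are nonzero, which is exactly the sparse-spike-train appearance described above.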
9 ML Parameter Estimation. The parameters θ = (h, λ, σ_r², σ_w²) need to be estimated. This is carried out using ML estimation: θ̂_ML = argmax_θ ln p(y | θ). This is an incomplete data problem; the complete-data criterion is θ̂_ML = argmax_θ ln p(r, q, y | θ). This maximization problem is solved using the stochastic expectation maximization (SEM) algorithm.
10 The Gibbs Sampler. Suppose we wish to sample a random vector x = (x_1, ..., x_n) according to f(x).
Gibbs sampler algorithm:
A. For a given x_t, generate y = (y_1, ..., y_n) as follows:
1) Draw y_1 from the conditional pdf f(x_1 | x_{t,2}, ..., x_{t,n}).
2) Draw y_i from the conditional pdf f(x_i | y_1, ..., y_{i-1}, x_{t,i+1}, ..., x_{t,n}).
3) Draw y_n from the conditional pdf f(x_n | y_1, ..., y_{n-1}).
B. Let x_{t+1} = y.
Under mild conditions, the limiting distribution of the process {x_t, t = 1, 2, ...} is precisely f(x).
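A minimal worked instance of the scheme above, on a target not from the thesis: a zero-mean bivariate normal with correlation ρ, whose full conditionals are known in closed form.

```python
import numpy as np

rng = np.random.default_rng(2)

def gibbs_bivariate_normal(rho, n_iter=6000, burn=1000):
    """Gibbs sampling for a zero-mean, unit-variance bivariate normal with
    correlation rho, alternately drawing from the exact full conditionals
    x1 | x2 ~ N(rho*x2, 1-rho^2) and x2 | x1 ~ N(rho*x1, 1-rho^2)."""
    x1 = x2 = 0.0
    s = np.sqrt(1.0 - rho ** 2)
    kept = []
    for t in range(n_iter):
        x1 = rho * x2 + s * rng.normal()   # step 1) of the scheme
        x2 = rho * x1 + s * rng.normal()   # step n) of the scheme
        if t >= burn:                      # discard burn-in, keep the rest
            kept.append((x1, x2))
    return np.asarray(kept)

samples = gibbs_bivariate_normal(0.8)
```

After burn-in, the empirical correlation of the kept samples approaches ρ, illustrating the "limiting distribution is f(x)" claim.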
11 The Gibbs Sampler (cont.). Simulates observations of q and r from p(r, q | y). The algorithm samples from:
p(r(k), q(k) | y, q_{-k}, r_{-k}) ~ Bi(λ_k) N(m_1, V_1)
Gibbs sampler algorithm:
1. Initialization: choice of q^(0) and r^(0).
2. For 1 ≤ i ≤ I and for k = 1, ..., N_r:
Detection step: compute λ_k = p(q(k) = 1 | y, q_{-k}, r_{-k}) and simulate q^(i)(k) ~ Bi(λ_k).
Estimation step: simulate r^(i)(k) ~ N(m_1, V_1) if q^(i)(k) = 1; otherwise r^(i)(k) = 0.
12 The SEM Algorithm. The SEM algorithm follows the steps:
1. Initialization: choice of r^(0), q^(0), θ^(0).
2. For i = 1, ..., I:
E step: simulation of r^(i), q^(i) by the Gibbs sampler according to p(r, q | y, θ^(i-1)).
M step: parameter estimation: θ^(i) = argmax_θ p(r^(i), q^(i), y | θ).
3. Final estimate: θ̂ = (1/(I - I_0)) Σ_{i=I_0+1}^{I} θ^(i).
13 The Deconvolution Process. MAP estimation: (r̂, q̂) = argmax_{r,q} p(r, q | y). This maximization problem can be solved in two steps:
Detection: q̂ = argmax_q p(q | y)
Estimation: r̂ = argmax_r p(r | y, q̂)
The detection problem is hard because q has 2^{N_r} discrete configurations. A simpler criterion called maximum posterior mode (MPM), which maximizes p(r(k), q(k) | y), is used instead.
14 The MPM Algorithm. The MPM algorithm follows the steps:
1. For i = 1, ..., I simulate (r^(i), q^(i)) using the Gibbs sampler.
2. For k = 1, ..., N_r:
Detection step: q̂(k) = 1 if (1/(I - I_0)) Σ_{i=I_0+1}^{I} q^(i)(k) > 0.5, otherwise q̂(k) = 0.
Estimation step: r̂(k) = Σ_{i=I_0+1}^{I} q^(i)(k) r^(i)(k) / Σ_{i=I_0+1}^{I} q^(i)(k) if q̂(k) = 1, otherwise r̂(k) = 0.
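The detection and estimation steps above reduce to simple averages over the kept Gibbs iterations; a sketch (array shapes and names are assumptions, not from the thesis):

```python
import numpy as np

def mpm_estimate(q_samples, r_samples):
    """MPM detection/estimation from Gibbs samples.  q_samples and
    r_samples have shape (n_kept_iterations, N): detect q_hat(k)=1 where
    the empirical posterior probability exceeds 0.5, then set r_hat(k) to
    the average of r(k) over the iterations in which q(k)=1."""
    prob = q_samples.mean(axis=0)          # detection step
    q_hat = (prob > 0.5).astype(int)
    num = (q_samples * r_samples).sum(axis=0)
    den = np.maximum(q_samples.sum(axis=0), 1)  # guard against 0/0
    r_hat = np.where(q_hat == 1, num / den, 0.0)  # estimation step
    return q_hat, r_hat
```

For example, if q(0) was active in 2 of 3 kept iterations with amplitudes 2 and 4, the estimate is q̂(0) = 1 and r̂(0) = 3.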
15 The MPM Algorithm (cont.). [Figure: sample realizations q^(1), ..., q^(5) and r^(1), ..., r^(5), and the resulting estimates q̂ and r̂.]
16 Reflectivity Post Processing. Fuse the following and replace them by their gravity center:
1. Two successive impulses.
2. Two impulses separated by one sample.
This maps (r̂, q̂) to a post-processed (r̂, q̂).
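The slide states the fusion rule only qualitatively; the sketch below is one plausible reading (amplitude-weighted gravity center, summed amplitude), with all names illustrative.

```python
import numpy as np

def fuse_close_impulses(r, max_gap=2):
    """Fuse each pair of impulses that are successive (gap 1) or separated
    by one sample (gap 2), replacing them by a single impulse at their
    amplitude-weighted gravity center.  Assumed rule: the fused amplitude
    is the sum of the two amplitudes."""
    r = r.astype(float).copy()
    changed = True
    while changed:
        changed = False
        idx = np.flatnonzero(r)
        for i in range(len(idx) - 1):
            k1, k2 = idx[i], idx[i + 1]
            if k2 - k1 <= max_gap:
                a1, a2 = r[k1], r[k2]
                kc = int(round((k1 * abs(a1) + k2 * abs(a2)) / (abs(a1) + abs(a2))))
                r[k1] = r[k2] = 0.0
                r[kc] = a1 + a2
                changed = True  # restart the scan over the updated spikes
                break
    return r
```

Each fusion strictly reduces the number of nonzero samples, so the loop terminates.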
17 Advantages and Limitations. Advantages:
1. Estimates the wavelet and the BG model's missing parameters.
2. Produces good estimates for single channel traces.
3. Uses stochastic methods, which are less likely to converge to local maximum points.
Limitations:
1. Does not account for the medium's stratified structure in the deconvolution process.
19 MBG-I Model. An exact 2D extension of the BG representation, comprised of:
1. An MBRF p(T, Q) which models the geometric properties of the reflectivity using location variables q and transition variables t (upward t^/, horizontal t^-, downward t^\).
2. p(R | T, Q) - a white Gaussian reflectivity amplitude model, defined conditionally on the MBRF.
[Figure: boundary representation with location and transition variables.]
20 Markov Bernoulli Random Field Characteristics. MBRF characteristics:
1. p(t_k^/, t_k^-, t_k^\) = p(t_k^/) p(t_k^-) p(t_k^\)
2. q_k ~ Bi(λ), t_k^/ ~ Bi(μ^/), t_k^- ~ Bi(μ^-), t_k^\ ~ Bi(μ^\), with
λ = 1 - (1 - μ^/)(1 - μ^-)(1 - μ^\)(1 - ε)
3. p(q_{k+1} = 1 | t_k^/ = 0, t_k^- = 0, t_k^\ = 0) = ε
Amplitude field characteristics:
1. r_k ~ N(0, σ_r²)
2. r_{k+d} = a r_k + w, w ~ N(0, (1 - a²) σ_r²)
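The amplitude rule r_{k+d} = a r_k + w with innovation variance (1 - a²)σ_r² is an AR(1) recursion chosen so that the marginal variance stays σ_r² along the boundary. A sketch (names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

def propagate_amplitude(r0, a, sigma_r, n_cols):
    """Propagate a reflector amplitude along a layer boundary with the
    AR(1) rule r[j+1] = a*r[j] + w, w ~ N(0, (1-a^2)*sigma_r^2).  With
    this innovation variance the stationary variance is
    (1-a^2)*sigma_r^2 / (1-a^2) = sigma_r^2, i.e. constant per column."""
    r = np.empty(n_cols)
    r[0] = r0
    s = np.sqrt(1.0 - a ** 2) * sigma_r
    for j in range(1, n_cols):
        r[j] = a * r[j - 1] + s * rng.normal()
    return r

amps = propagate_amplitude(1.0, a=0.9, sigma_r=1.0, n_cols=200)
```

A value of a close to 1 yields slowly varying amplitudes across traces, matching the smooth, continuous boundaries assumed by the model.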
21 Deconvolution Scheme. MAP estimation:
(R̂, Q̂, T̂^/, T̂^-, T̂^\) = argmax p(R, Q, T^/, T^-, T^\ | Y)
Computing the exact MAP solution is practically impossible. The following suboptimal recursive maximization procedure is used instead:
1. First column: (r̂_1, q̂_1) = argmax_{r_1, q_1} p(r_1, q_1 | y_1)
2. Columns 2 ≤ j ≤ J: (r̂_j, q̂_j, t̂_j^/, t̂_j^-, t̂_j^\) = argmax p(r_j, q_j, t_j^/, t_j^-, t_j^\ | y_j, r̂_{j-1}, q̂_{j-1})
Each partial criterion is maximized using a suboptimal SMLR-type algorithm.
22 Deconvolution Scheme (cont.). [Figure: column-recursive estimation - q̂_1, r̂_1 feed the estimation of t̂_2, q̂_2, r̂_2, which in turn feed t̂_3, q̂_3, r̂_3, and so on.]
23 Advantages and Limitations. Advantages:
1. Produces good estimates of the 2D reflectivity.
Limitations:
1. Non-blind.
2. Each partial criterion is maximized only with respect to r_j, q_j, t_j^/, t_j^-, t_j^\.
3. r_j is determined based on observations only up to y_j.
4. The SMLR-type algorithm may converge to a local optimum.
5. The first reflectivity column is assumed to be known.
25 Blind Multichannel MCMC Deconvolution. Uses the MBG-I reflectivity model, with the following suboptimal recursive maximization procedure:
1. First column: (r̂_1, q̂_1) = argmax_{r_1, q_1} p(r_1, q_1 | y_1)
2. Columns 2 ≤ j ≤ J: (r̂_j, q̂_j, t̂_j^/, t̂_j^-, t̂_j^\) = argmax p(r_j, q_j, t_j^/, t_j^-, t_j^\ | y_j, r̂_{j-1}, q̂_{j-1})
The SMLR-type algorithm is replaced by an extended version of the MPM algorithm, which maximizes p(t_{k}^/, t_{k}^-, t_{k}^\, q_{k}, r_{k} | y_j, q̂_{j-1}, r̂_{j-1}).
26 MBG-I Parameter Estimation. The parameters θ = (h, λ, σ_r², σ_w²) are estimated using the SEM algorithm. The following method is used to estimate θ_MBG = (a, μ^/, μ^-, μ^\, ε):
1. Apply single channel deconvolution to each of Y's columns.
2. Remove all the isolated reflectors from the obtained reflectivity section.
3. Calculate:
μ^/ = # upward boundary transitions / # samples in T^/
μ^- = # horizontal boundary transitions / # samples in T^-
μ^\ = # downward boundary transitions / # samples in T^\
a = average attenuation ratio between neighboring reflectors
27 Multichannel Deconvolution's Gibbs Sampler. Used by the extended MPM algorithm. Simulates observations of r, q, t^/, t^-, t^\ from p(r, q, t^/, t^-, t^\ | y, r̂_{j-1}, q̂_{j-1}). The algorithm samples from:
1. p(r_k, q_k | y, r_{-k}, q_{-k}, r̂_{j-1}, q̂_{j-1}, t^/, t^-, t^\) ~ Bi(λ_k^b) N(m_b, V_b)
2. p(t_k^/ | t^-, t^\, q, r, r̂_{j-1}, q̂_{j-1}, y) ~ Bi(μ_k^/)
3. p(t_k^- | t^/, t^\, q, r, r̂_{j-1}, q̂_{j-1}, y) ~ Bi(μ_k^-)
4. p(t_k^\ | t^/, t^-, q, r, r̂_{j-1}, q̂_{j-1}, y) ~ Bi(μ_k^\)
28 Multichannel Deconvolution's Gibbs Sampler Algorithm.
1. Initialization: choice of r^(0), q^(0), t^{/(0)}, t^{-(0)}, t^{\(0)}.
2. For 1 ≤ i ≤ I and for k = 1, ..., N_r:
Detection step: compute λ_k^b, μ_k^/, μ_k^-, μ_k^\ and simulate the transition and location variables t_k^{/(i)} ~ Bi(μ_k^/), t_k^{-(i)} ~ Bi(μ_k^-), t_k^{\(i)} ~ Bi(μ_k^\), q_k^{(i)} ~ Bi(λ_k^b).
Estimation step: simulate r_k^{(i)} ~ N(m_b, V_b) if q_k^{(i)} = 1; otherwise r_k^{(i)} = 0.
29 Multichannel Deconvolution's MPM Algorithm. The extended MPM algorithm follows the steps:
1. For i = 1, ..., I simulate r^(i), q^(i), t^{/(i)}, t^{-(i)}, t^{\(i)} using the Gibbs sampler.
2. For k = 1, ..., N_r:
Detection step: set each of t̂_k^/, t̂_k^-, t̂_k^\ and q̂_k to 1 if its empirical mean over iterations i = I_0+1, ..., I exceeds 0.5, and to 0 otherwise.
30 Multichannel Deconvolution's MPM Algorithm (cont.). Estimation step:
r̂_k = Σ_{i=I_0+1}^{I} q_k^{(i)} r_k^{(i)} / Σ_{i=I_0+1}^{I} q_k^{(i)} if q̂_k = 1, otherwise r̂_k = 0.
[Figure: sample realizations t^{/(1)}, ..., t^{/(5)} and r^{(1)}, ..., r^{(5)}, and the resulting estimates t̂^/ and r̂.]
31 Synthetic Data Results. [Figures: reflectivity, wavelet, and seismic traces at SNR = 5 dB and SNR = 10 dB.]
32 Synthetic Results - Estimated Parameters. [Figures: real vs. estimated wavelets at 5 dB and 10 dB; table of true and estimated values of λ, σ_r/σ_w, a, μ^/, μ^-, μ^\, ε at both SNRs.]
33 Synthetic Data Results, SNR = 5 dB. [Figures: single channel deconvolution vs. multichannel deconvolution.]
34 Synthetic Data Results, SNR = 10 dB. [Figures: single channel deconvolution vs. multichannel deconvolution.]
35 Synthetic Results - Performance Quality Measures. The following loss functions were used to quantify the performance of the algorithms:
L_{miss+false} = ||r̂ - r||² + N_miss + N_false
L_miss = ||r̂ - r||² + N_miss
L_false = ||r̂ - r||² + N_false
L_SSQ = ||r̂ - r||²
L²_{miss+false}, L²_miss, and L²_false give partial credit to reflectors that are close to their true positions.
[Table: loss values for single channel (SC) and multichannel (MC) deconvolution at SNR = 5 dB and 10 dB.]
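The basic measures above can be sketched directly; this is one plausible reading of the slide's definitions (the partial-credit L² variants are omitted, and all names are illustrative).

```python
import numpy as np

def quality_measures(r_hat, r_true):
    """Squared reconstruction error, optionally penalized by the number of
    missed reflectors (true spike, no detection) and/or falsely detected
    reflectors (detection, no true spike)."""
    miss = int(np.sum((r_true != 0) & (r_hat == 0)))
    false = int(np.sum((r_true == 0) & (r_hat != 0)))
    ssq = float(np.sum((r_hat - r_true) ** 2))
    return {
        "L_ssq": ssq,
        "L_miss": ssq + miss,
        "L_false": ssq + false,
        "L_miss_false": ssq + miss + false,
    }
```

For instance, one missed spike of amplitude 2 and one false spike of amplitude 1 contribute 4 + 1 to L_SSQ and one count each to N_miss and N_false.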
36 Real Data - Estimated Parameters. [Figure: estimated wavelet; table of estimated values of λ, σ_r/σ_w, a, μ^/, μ^-, μ^\, ε.]
37 Real Data Results. [Figures: real data, single channel deconvolution, multichannel deconvolution.]
39 Deconvolution by Smoothing. Uses the MBG-I reflectivity prior model. Uses the same parameter estimation method as the first proposed algorithm. Uses a column-recursive deconvolution scheme, similar to that of the previous algorithm, but accounts for observation columns subsequent to y_j in the estimation of r_j.
40 Smoothing Scheme. [Figure.]
41 Smoothing Windows. [Figure: the smoothing windows of samples used in estimating each column.]
42 Smoothing Scheme (cont.). [Figure: the variables q_j, r_j, t_j and the neighboring columns j-1 and j+1 involved in estimating column j.]
43 Deconvolution Scheme. Uses the following estimation procedure:
First column: (r̂_1, q̂_1) = argmax p(r_1, q_1, t^/, t^-, t^\ | y_1, y_2)
Middle columns: (r̂_j, q̂_j, t̂_j^/, t̂_j^-, t̂_j^\) = argmax p(r_j, q_j, t_j^/, t_j^-, t_j^\ | y_j, y_{j+1}, q̂_{j-1}, r̂_{j-1})
Last column: (r̂_J, q̂_J, t̂_J^/, t̂_J^-, t̂_J^\) = argmax p(r_J, q_J, t_J^/, t_J^-, t_J^\ | y_J, q̂_{J-1}, r̂_{J-1})
Uses a further extended version of the MPM algorithm, which maximizes p(r_k, q_k, t_k^/, t_k^-, t_k^\ | y, q̂_{j-1}, r̂_{j-1}).
44 Smoothing's Gibbs Sampler. Used by the smoothing's MPM algorithm. Simulates observations of r, q, t^/, t^-, t^\ from p(r, q, t^/, t^-, t^\ | y, r̂_{j-1}, q̂_{j-1}). The algorithm samples r_k, q_k from
p(r_k, q_k | y, r_{-k}, q_{-k}, r̂_{j-1}, q̂_{j-1}, r_{j+1}, q_{j+1}, t^/, t^-, t^\) ~ Bi(λ_k^m) N(m_m, V_m)
and samples the transition variables from the multichannel Gibbs sampler's conditional distributions.
45 Smoothing's Gibbs Sampler Algorithm.
1. Initialization: choice of r^(0), q^(0), t^{/(0)}, t^{-(0)}, t^{\(0)}.
2. For 1 ≤ i ≤ I:
a. For k = 1, ..., N_r (current column):
Detection step: compute λ_k^m, μ_k^/, μ_k^-, μ_k^\ and simulate t_k^{/(i)} ~ Bi(μ_k^/), t_k^{-(i)} ~ Bi(μ_k^-), t_k^{\(i)} ~ Bi(μ_k^\), q_k^{(i)} ~ Bi(λ_k^m).
Estimation step: simulate r_k^{(i)} ~ N(m_b, V_b) if q_k^{(i)} = 1; otherwise r_k^{(i)} = 0.
b. For the subsequent column, follow the multichannel Gibbs sampler procedure.
46 Smoothing's MPM Algorithm. The smoothing's MPM algorithm follows the steps:
1. For i = 1, ..., I simulate r^(i), q^(i), t^{/(i)}, t^{-(i)}, t^{\(i)} using the Gibbs sampler.
2. For k = 1, ..., N_r:
Detection step: set each of t̂_k^/, t̂_k^-, t̂_k^\ and q̂_k to 1 if its empirical mean over iterations i = I_0+1, ..., I exceeds 0.5, and to 0 otherwise.
47 Smoothing's MPM Algorithm (cont.). Estimation step:
r̂_k = Σ_{i=I_0+1}^{I} q_k^{(i)} r_k^{(i)} / Σ_{i=I_0+1}^{I} q_k^{(i)} if q̂_k = 1, otherwise r̂_k = 0.
48 Smoothing's MPM Algorithm (cont.). [Figure: sample realizations q^(1), ..., q^(5) and r^(1), ..., r^(5), and the resulting estimates q̂ and r̂.]
49 Synthetic Data Results, SNR = 5 dB. [Figures: multichannel deconvolution vs. deconvolution by smoothing.]
50 Synthetic Data Results, SNR = 10 dB. [Figures: multichannel deconvolution vs. deconvolution by smoothing.]
51 Synthetic Results - Performance Quality Measures. [Table: L_{miss+false}, L_miss, L_false, L_SSQ, L²_{miss+false}, L²_miss, L²_false for single channel (SC), multichannel (MC), and smoothing deconvolution at SNR = 5 dB and 10 dB.]
52 Real Data Results. [Figures: single channel deconvolution, multichannel deconvolution, smoothing.]
54 Conclusions.
- We presented two stochastic blind multichannel deconvolution algorithms.
- The proposed parameter estimation method successfully recovers the MBG-I model's parameters.
- Both proposed algorithms produce better deconvolution results than the single channel blind deconvolution method.
- The smoothing algorithm produces better deconvolution results than the proposed multichannel algorithm.
- The performance of both blind deconvolution schemes improves as the SNR increases.
55 Future Research.
- Replacing the MBG-I model by the MBG-II model.
- Analyzing the smoothing algorithm's performance for larger smoothing windows.
- Using wavelet estimation methods which can estimate both the wavelet's length and its maximum position.
More informationLecture 15: Thu Feb 28, 2019
Lecture 15: Thu Feb 28, 2019 Announce: HW5 posted Lecture: The AWGN waveform channel Projecting temporally AWGN leads to spatially AWGN sufficiency of projection: irrelevancy theorem in waveform AWGN:
More informationExpectation Maximization
Expectation Maximization Seungjin Choi Department of Computer Science and Engineering Pohang University of Science and Technology 77 Cheongam-ro, Nam-gu, Pohang 37673, Korea seungjin@postech.ac.kr 1 /
More informationTRINICON: A Versatile Framework for Multichannel Blind Signal Processing
TRINICON: A Versatile Framework for Multichannel Blind Signal Processing Herbert Buchner, Robert Aichner, Walter Kellermann {buchner,aichner,wk}@lnt.de Telecommunications Laboratory University of Erlangen-Nuremberg
More informationDefect Detection Using Hidden Markov Random Fields
Electrical and Computer Engineering Publications Electrical and Computer Engineering 5 Defect Detection Using Hidden Markov Random Fields Aleksandar Dogandžić Iowa State University, ald@iastate.edu Nawanat
More informationIntegrated Non-Factorized Variational Inference
Integrated Non-Factorized Variational Inference Shaobo Han, Xuejun Liao and Lawrence Carin Duke University February 27, 2014 S. Han et al. Integrated Non-Factorized Variational Inference February 27, 2014
More informationSUMMARY INTRODUCTION LOCALIZED PHASE ESTIMATION
Local similarity with the envelope as a seismic phase detector Sergey Fomel, The University of Texas at Austin, and Mirko van der Baan, University of Alberta SUMMARY We propose a new seismic attribute,
More informationDeconvolution imaging condition for reverse-time migration
Stanford Exploration Project, Report 112, November 11, 2002, pages 83 96 Deconvolution imaging condition for reverse-time migration Alejandro A. Valenciano and Biondo Biondi 1 ABSTRACT The reverse-time
More informationMyopic sparse image reconstruction with application to MRFM
Myopic sparse image reconstruction with application to MRFM Se Un Park a,*, Nicolas Dobigeon b,* and Alfred O. Hero a a University of Michigan, EECS Department, Ann Arbor, USA; b University of Toulouse,
More informationStructured signal recovery from non-linear and heavy-tailed measurements
Structured signal recovery from non-linear and heavy-tailed measurements Larry Goldstein* Stanislav Minsker* Xiaohan Wei # *Department of Mathematics # Department of Electrical Engineering UniversityofSouthern
More informationChapter 20. Deep Generative Models
Peng et al.: Deep Learning and Practice 1 Chapter 20 Deep Generative Models Peng et al.: Deep Learning and Practice 2 Generative Models Models that are able to Provide an estimate of the probability distribution
More informationBayesian Estimation of Input Output Tables for Russia
Bayesian Estimation of Input Output Tables for Russia Oleg Lugovoy (EDF, RANE) Andrey Polbin (RANE) Vladimir Potashnikov (RANE) WIOD Conference April 24, 2012 Groningen Outline Motivation Objectives Bayesian
More informationParameter Estimation in a Moving Horizon Perspective
Parameter Estimation in a Moving Horizon Perspective State and Parameter Estimation in Dynamical Systems Reglerteknik, ISY, Linköpings Universitet State and Parameter Estimation in Dynamical Systems OUTLINE
More information6.867 Machine Learning
6.867 Machine Learning Problem set 1 Due Thursday, September 19, in class What and how to turn in? Turn in short written answers to the questions explicitly stated, and when requested to explain or prove.
More informationENGINEERING TRIPOS PART IIB: Technical Milestone Report
ENGINEERING TRIPOS PART IIB: Technical Milestone Report Statistical enhancement of multichannel audio from transcription turntables Yinhong Liu Supervisor: Prof. Simon Godsill 1 Abstract This milestone
More information10. Multi-objective least squares
L Vandenberghe ECE133A (Winter 2018) 10 Multi-objective least squares multi-objective least squares regularized data fitting control estimation and inversion 10-1 Multi-objective least squares we have
More informationChapter 7: Channel coding:convolutional codes
Chapter 7: : Convolutional codes University of Limoges meghdadi@ensil.unilim.fr Reference : Digital communications by John Proakis; Wireless communication by Andreas Goldsmith Encoder representation Communication
More informationA Petroleum Geologist's Guide to Seismic Reflection
A Petroleum Geologist's Guide to Seismic Reflection William Ashcroft WILEY-BLACKWELL A John Wiley & Sons, Ltd., Publication Contents Preface Acknowledgements xi xiii Part I Basic topics and 2D interpretation
More informationImage Noise: Detection, Measurement and Removal Techniques. Zhifei Zhang
Image Noise: Detection, Measurement and Removal Techniques Zhifei Zhang Outline Noise measurement Filter-based Block-based Wavelet-based Noise removal Spatial domain Transform domain Non-local methods
More informationDetection ASTR ASTR509 Jasper Wall Fall term. William Sealey Gosset
ASTR509-14 Detection William Sealey Gosset 1876-1937 Best known for his Student s t-test, devised for handling small samples for quality control in brewing. To many in the statistical world "Student" was
More informationAn Introduction to Expectation-Maximization
An Introduction to Expectation-Maximization Dahua Lin Abstract This notes reviews the basics about the Expectation-Maximization EM) algorithm, a popular approach to perform model estimation of the generative
More informationarxiv: v3 [physics.geo-ph] 9 Jan 2013
Short-time homomorphic wavelet estimation arxiv:1209.0196v3 [physics.geo-ph] 9 Jan 2013 Roberto Henry Herrera and Mirko van der Baan Department of Physics, University of Alberta, Edmonton T6G 2E1, CA E-mail:
More information9 Multi-Model State Estimation
Technion Israel Institute of Technology, Department of Electrical Engineering Estimation and Identification in Dynamical Systems (048825) Lecture Notes, Fall 2009, Prof. N. Shimkin 9 Multi-Model State
More informationModule 1 - Signal estimation
, Arraial do Cabo, 2009 Module 1 - Signal estimation Sérgio M. Jesus (sjesus@ualg.pt) Universidade do Algarve, PT-8005-139 Faro, Portugal www.siplab.fct.ualg.pt February 2009 Outline of Module 1 Parameter
More informationData-aided and blind synchronization
PHYDYAS Review Meeting 2009-03-02 Data-aided and blind synchronization Mario Tanda Università di Napoli Federico II Dipartimento di Ingegneria Biomedica, Elettronicae delle Telecomunicazioni Via Claudio
More informationSequential Monte Carlo and Particle Filtering. Frank Wood Gatsby, November 2007
Sequential Monte Carlo and Particle Filtering Frank Wood Gatsby, November 2007 Importance Sampling Recall: Let s say that we want to compute some expectation (integral) E p [f] = p(x)f(x)dx and we remember
More informationA latent variable modelling approach to the acoustic-to-articulatory mapping problem
A latent variable modelling approach to the acoustic-to-articulatory mapping problem Miguel Á. Carreira-Perpiñán and Steve Renals Dept. of Computer Science, University of Sheffield {miguel,sjr}@dcs.shef.ac.uk
More informationBayesian Decision and Bayesian Learning
Bayesian Decision and Bayesian Learning Ying Wu Electrical Engineering and Computer Science Northwestern University Evanston, IL 60208 http://www.eecs.northwestern.edu/~yingwu 1 / 30 Bayes Rule p(x ω i
More informationMonte Carlo methods for sampling-based Stochastic Optimization
Monte Carlo methods for sampling-based Stochastic Optimization Gersende FORT LTCI CNRS & Telecom ParisTech Paris, France Joint works with B. Jourdain, T. Lelièvre, G. Stoltz from ENPC and E. Kuhn from
More informationEfficient Variational Inference in Large-Scale Bayesian Compressed Sensing
Efficient Variational Inference in Large-Scale Bayesian Compressed Sensing George Papandreou and Alan Yuille Department of Statistics University of California, Los Angeles ICCV Workshop on Information
More informationGaussian Processes for Audio Feature Extraction
Gaussian Processes for Audio Feature Extraction Dr. Richard E. Turner (ret26@cam.ac.uk) Computational and Biological Learning Lab Department of Engineering University of Cambridge Machine hearing pipeline
More informationState Space and Hidden Markov Models
State Space and Hidden Markov Models Kunsch H.R. State Space and Hidden Markov Models. ETH- Zurich Zurich; Aliaksandr Hubin Oslo 2014 Contents 1. Introduction 2. Markov Chains 3. Hidden Markov and State
More informationPaul Karapanagiotidis ECO4060
Paul Karapanagiotidis ECO4060 The way forward 1) Motivate why Markov-Chain Monte Carlo (MCMC) is useful for econometric modeling 2) Introduce Markov-Chain Monte Carlo (MCMC) - Metropolis-Hastings (MH)
More informationCS839: Probabilistic Graphical Models. Lecture 7: Learning Fully Observed BNs. Theo Rekatsinas
CS839: Probabilistic Graphical Models Lecture 7: Learning Fully Observed BNs Theo Rekatsinas 1 Exponential family: a basic building block For a numeric random variable X p(x ) =h(x)exp T T (x) A( ) = 1
More informationReducing The Computational Cost of Bayesian Indoor Positioning Systems
Reducing The Computational Cost of Bayesian Indoor Positioning Systems Konstantinos Kleisouris, Richard P. Martin Computer Science Department Rutgers University WINLAB Research Review May 15 th, 2006 Motivation
More informationAcoustic MIMO Signal Processing
Yiteng Huang Jacob Benesty Jingdong Chen Acoustic MIMO Signal Processing With 71 Figures Ö Springer Contents 1 Introduction 1 1.1 Acoustic MIMO Signal Processing 1 1.2 Organization of the Book 4 Part I
More information