COMPLEX PRINCIPAL COMPONENT SPECTRA EXTRACTION


PROGRAM complex_pca_spectra

Computing principal components

To begin, click the Formation attributes tab in the AASPI-UTIL window and select program complex_pca_spectra:

Program complex_pca_spectra filters complex stratal slices using Principal Component Analysis (PCA). The following window appears:

First, enter (1) the name of the spectral magnitude and (2) the name of the spectral phase stratal slice input files (*.h) you wish to filter, generated by program complex_stratal_slice, as well as a Unique Project Name and Suffix as you have done for other AASPI programs. Fields (3) and (4) are the properties of the input complex stratal slices and are filled in automatically. Here you can change the lowest and highest frequency (in Hz) to be analyzed.

Principal Component Analysis

Seismic data are noisy enough that we need to condition the data before using Q-estimation methods. Here we filter the stratal spectral slices using Principal Component Analysis (PCA), an orthogonal linear transformation that estimates the correlation between measurements. There are a number of ways to implement the PCA algorithm. We employ the one

described by Guo et al. (2009). Given the spectra of a data volume, each frequency component represents a measurement. The measurements are normalized to zero mean and then input to the PCA algorithm. If the spectra along a given horizon or stratal slice all have the same shape, they will be completely defined by the first eigenspectrum. Eigenspectra associated with the second, third, fourth, and subsequent largest eigenvalues reveal variations in the spectra that are critical to the Q-estimation algorithm.

Principal Component Analysis is a linear orthogonal transformation, $P = [p_1, p_2, p_3, \ldots, p_m]$, that transforms a set of correlated observations, $X = [x_1, x_2, x_3, \ldots, x_n]$, where the vector $x_j$ is the $j$-th observation, to a new domain such that the transformed data, $Y = [y_1, y_2, y_3, \ldots, y_n]$, consist of only uncorrelated events. The relationship between X and Y is represented by

$$P X = Y . \qquad (1)$$

The set of all vectors $p_k$ makes up the principal components of X. Each vector $y_j$ is the projection of $x_j$ onto all the principal components.

To calculate P, we first compute the covariance matrix of X, $C_X$:

$$C_X = \frac{1}{n} X X^T , \qquad (2)$$

where the element $C_X(i,j)$ is the cross-correlation between the two observations $x_i$ and $x_j$. The covariance matrix in the transformed domain, $C_Y$, is related to $C_X$ by

$$C_Y = P C_X P^T , \qquad (3)$$

where $C_Y$ is a diagonal matrix, since Y consists of uncorrelated events. $C_X$ is a symmetric positive-definite matrix; therefore its eigenvector matrix V is orthonormal and satisfies $V^{-1} = V^T$. It can be verified that $P = V^T$ is a solution to equation (3). By picking $P = V^T$, P can be rewritten as the sum of weighted outer products of the principal components (eigenvalue decomposition):

m P e p p, (4) 1 Where e is the eigenvalue associated with the eigenvector p. Substituting equation (4) into equation (1) gives: Y e p p X m, (5) 1 Without loss of generality, suppose the eigenvalues e s are sorted in descending order e 1 e 2 e m 0. hen define Y k to be the partial sum ofy : k Yk e p p X, (6) 1 In practice, assuming that the large variance is associated with important structures and small variance is associated with noise, limiting the number of terms in the sum is a filter that enhances the SNR. he PCA can be extended to apply to complex data. he mathematics is the same as for real data, except the covariance matrix of X, C X, is a selfadoint matrix: H X. C C X Number of principal components (5) is the number to generate, in this case is 5. Others are the boxes for output file options. heir names can stand for their meanings. After executing the module, we can get filtered stratal slices generated by different principal components. Attribute-Assisted Seismic Processing and Interpretation Page 4

We can use AASPI plot to display the results, taking the magnitude slices as an example. Here we show the first two principal components.

We also output the PC-reconstructed data. The reconstructed data have the same structure as the input data. For comparison, we show the original data as well. Since 5 principal components contain most of the information, the difference between the two is not obvious.
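To quantify how small that difference is, one can compute a relative residual between the original and reconstructed slices. The helper below is a hypothetical sketch; the arrays would come from the corresponding input and output *.h files, and reading them is not shown.

    import numpy as np

    def relative_residual(original, reconstructed):
        """Relative RMS difference between an input slice and its PC reconstruction."""
        return np.linalg.norm(original - reconstructed) / np.linalg.norm(original)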

To better understand how large a portion of the energy each principal component accounts for, we plot the eigenvalue magnitudes. For now, we use some simple SEP commands; the eigenvalue magnitudes can be plotted by typing:

Graph < spectra_eigenvalue_docu_0.h Stube &

For the t = 0.333 s slice, note that the first 3 components contain most of the energy of the whole dataset, so for further filtering you can just use 1 or 2 principal components. We can also use SEP commands to show the eigenvector magnitude and phase:
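If you would rather inspect the eigenvalues in Python than with SEP, the eigenvalues e returned by pca_filter_spectra in the earlier sketch can be summarized directly. This is a hypothetical alternative, not part of the AASPI workflow; matplotlib is assumed to be available.

    import numpy as np
    import matplotlib.pyplot as plt

    def plot_eigenvalue_energy(e):
        """Plot each eigenvalue's fraction of the total energy and the cumulative fraction."""
        frac = np.abs(e) / np.abs(e).sum()
        idx = np.arange(1, len(e) + 1)
        fig, ax = plt.subplots()
        ax.bar(idx, frac, label="per component")
        ax.plot(idx, np.cumsum(frac), "o-", color="black", label="cumulative")
        ax.set_xlabel("principal component")
        ax.set_ylabel("fraction of total energy")
        ax.legend()
        plt.show()

If the first two or three bars dominate such a plot, reconstructing with 1 or 2 components is sufficient, consistent with the observation above.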

Other outputs can be shown in a similar way.

References

Guo, H., K. J. Marfurt, and J. Liu, 2009, Principal component spectral analysis: Geophysics, 74, 35-43.