3D Petrophysical Modeling Using Complex Seismic Attributes and Limited Well Log Data
Mehdi Eftekharfar* and De-Hua Han, Rock Physics Lab, University of Houston

Summary

A method for 3D modeling and interpretation of log properties from complex seismic attributes (obtained from 3D post-stack seismic data) is developed by integrating Principal Component Analysis and Local Linear Modeling. Complex seismic attributes have non-linear relationships with the petrophysical properties of rocks, and these complicated relationships can be approximated using statistical methods. The method has been tested successfully on real data sets. Log properties (sonic, gamma ray, density, etc.) were predicted at the location of a second well (blind well test). The method has proved to work with limited log information (data from one well), whereas conventional methods used for this purpose, such as geostatistics, need well log information from several wells to correlate seismic and well data. Once the performance of the model is verified by the blind well test, 3D log volumes can be calculated from 3D seismic attribute data.

Introduction

Seismic data are routinely and effectively used to estimate the structure of reservoir bodies, and they have increasingly been used to estimate the spatial distribution of rock properties. Many authors have investigated the possible relations between individual seismic attributes and rock properties (or structural indications); Taner et al. (1994) published a list of such relations. The idea of using multiple seismic attributes to predict log properties was first proposed by Schultz et al. (1994), who used log data from 5 wells to train with seismic attributes and predict rock properties. Hampson et al. (2001) also explain the use of different neural networks for multi-attribute analysis and reservoir property prediction. While we know that the features of seismic signals are directly caused by rock physics phenomena, the links between the two are complex and difficult to derive theoretically.
Seismic response depends on many variables, such as temperature, volume of clay, overburden pressure, and the nature and geometry of the layering, among other factors which affect the elastic and absorption response. These complex relations can vary from one layer to another, and even within a single layer or reservoir compartment (Schultz et al., 1994). Schultz et al. (1994) showed that the simultaneous use of seismic attributes with well log data leads to better prediction of reservoir or rock properties than estimation from well data alone. A reasonable way to combine seismic attributes and well log data for property prediction should therefore include a statistical method.

Methodology

The data used in this study belong to a shaly sandstone reservoir in the Middle East. Two wells were selected for this study: one for training the network and the other for cross-validation (the distance between the two wells is approximately 4 km). The main seismic attributes used in this study are amplitude envelope, first and second derivatives of amplitude, quadrature trace, instantaneous frequency, and dominant frequency. The attribute data are normalized and reduced using principal component analysis before the modeling task. The unsupervised clustering method proposed in this paper is the Fuzzy Self-Organizing Map (FSOM), in which the Gustafson-Kessel algorithm is integrated into the learning and updating strategies of the Kohonen Self-Organizing Map (Kohonen, 1989). Bezdek et al. (1992) introduced an FSOM method by combining FCM clustering with the learning-rate and updating strategies of the SOM and showed the superiority of this method over the SOM. Hu et al. (2004) likewise show that the FSOM is more efficient than the SOM and vector quantization in both speed and accuracy.
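The normalization and principal-component reduction step described above can be sketched as follows. This is a minimal illustration on synthetic attribute data, not the authors' code; the 95% variance-retention threshold and the synthetic attribute matrix are assumptions of ours.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 6 seismic attributes sampled at 1000 points
# (amplitude envelope, derivatives, quadrature, frequencies in the paper).
attributes = rng.normal(size=(1000, 6))
attributes[:, 1] = 0.8 * attributes[:, 0] + 0.2 * attributes[:, 1]  # a correlated pair

# Normalize each attribute to zero mean and unit variance.
z = (attributes - attributes.mean(axis=0)) / attributes.std(axis=0)

# PCA via eigendecomposition of the covariance matrix.
cov = np.cov(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]                 # sort by descending variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Keep enough components to explain ~95% of the variance (threshold assumed).
explained = np.cumsum(eigvals) / eigvals.sum()
n_keep = int(np.searchsorted(explained, 0.95) + 1)
reduced = z @ eigvecs[:, :n_keep]                 # reduced attribute matrix
```

The reduced matrix then feeds the clustering and local linear modeling stages in place of the raw attributes.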
In this method, equation (1) is devised for updating the centers of the clusters (Bezdek et al., 1992):

    c_l(t+1) = c_l(t) + [ Σ_k w_lk(t) · (u_k − c_l(t)) ] / [ Σ_k w_lk(t) ]        (1)

where

    w_lk(t) = 1 / Σ_{m=1}^{K} ( D_lk(t) / D_mk(t) )^{2/(q−1)}                     (2)

    D²_lk(t) = (u_k − c_l(t))ᵀ · Σ_l⁻¹ · (u_k − c_l(t))                           (3)

    Σ_l = F_l · ( v_l · det(F_l) )^{−1/p}                                         (4)

    F_l = [ Σ_k w_lk^q · (u_k − c_l(t))(u_k − c_l(t))ᵀ ] / Σ_k w_lk^q             (5)

Here k = 1, …, N indexes the data points, l = 1, …, K (or C) indexes the clusters, p is the dimension of the input data (number of attributes), c_l is the center of cluster l, q > 1 is the fuzziness exponent, and v_l is the Gustafson-Kessel cluster volume constraint (typically v_l = 1).

2011 SEG | SEG San Antonio 2011 Annual Meeting
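One pass of the update defined by equations (1)-(5) can be sketched numerically. This is an assumption-laden illustration, not the authors' implementation: the function name gk_step is hypothetical, and the fuzzifier q = 2 and volume constraint rho = 1 are choices of ours.

```python
import numpy as np

def gk_step(U, C, W, q=2.0, rho=1.0):
    """One Gustafson-Kessel-style update: distances (3)-(5), memberships (2), centers (1).

    U : (N, p) data, C : (K, p) centers, W : (K, N) memberships.
    """
    N, p = U.shape
    K = C.shape[0]

    # Fuzzy covariance F_l (eq 5) and norm matrix Sigma_l (eq 4) per cluster,
    # then the scaled squared distances (eq 3).
    D2 = np.empty((K, N))
    for l in range(K):
        diff = U - C[l]                                   # (N, p)
        wq = W[l] ** q
        F = (wq[:, None] * diff).T @ diff / wq.sum()
        Sigma = F * (rho * np.linalg.det(F)) ** (-1.0 / p)
        inv = np.linalg.inv(Sigma)
        D2[l] = np.einsum('nd,de,ne->n', diff, inv, diff)
    D2 = np.maximum(D2, 1e-12)                            # avoid division by zero

    # Membership update (eq 2); note D^2 ratios, hence exponent 1/(q-1).
    ratio = (D2[:, None, :] / D2[None, :, :]) ** (1.0 / (q - 1.0))
    W_new = 1.0 / ratio.sum(axis=1)

    # Center update (eq 1): membership-weighted mean of the data.
    C_new = (W_new @ U) / W_new.sum(axis=1, keepdims=True)
    return C_new, W_new
```

Iterating gk_step until the centers stop moving gives the cluster centers that are embedded in the hidden layer of the local linear neuro-fuzzy network described next.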

Here w_lk is the membership degree of data sample u_k in cluster l, and t is the iteration number (Nelles, 2001). After clustering the data, the centers can be embedded in the hidden layer of a local linear neuro-fuzzy network. The network structure of a local linear neuro-fuzzy model is depicted in Figure 1. Each neuron realizes a local linear model (LLM) and an associated validity function that determines the region of validity of that LLM. The outputs of the LLMs are (Nelles, 2001)

    ŷ_i = w_i0 + w_i1·u_1 + w_i2·u_2 + … + w_ip·u_p,    i = 1, …, M               (6)

where M is the number of local linear models (in other words, clusters), p is the dimensionality of the input space (number of attributes), and w_ij denotes the LLM parameters for neuron i. The validity functions form a partition of unity, i.e., they are normalized such that

    Σ_{i=1}^{M} Φ_i(u) = 1                                                        (7)

The output of the local linear neuro-fuzzy model is

    ŷ = Σ_{i=1}^{M} ( w_i0 + w_i1·u_1 + … + w_ip·u_p ) · Φ_i(u)                   (8)

The network output is therefore calculated as a weighted sum of the outputs of the local linear models, where the Φ_i(·) are interpreted as operating-point-dependent weighting factors. The neuro-fuzzy network interpolates between the different LLMs with the validity functions, which are chosen as normalized Gaussians (Nelles, 2001):

    Φ_i(u) = μ_i(u) / Σ_{j=1}^{M} μ_j(u)                                          (9)

where

    μ_i(u) = exp( −(1/2) · [ (u_1 − c_i1)²/σ_i1² + … + (u_p − c_ip)²/σ_ip² ] )    (10)

In the global estimation approach (Nelles, 2001), all parameters are estimated simultaneously by least-squares optimization. The parameter vector contains all M(p+1) parameters of the local linear neuro-fuzzy model in (8) with M neurons and p inputs:

    w = [ w_10, w_11, …, w_1p, …, w_M0, w_M1, …, w_Mp ]ᵀ                           (11)

The associated regression matrix X for N measured data points is (Nelles, 2001)

    X = [ X_1^(sub)  X_2^(sub)  …  X_M^(sub) ]                                     (12)

with

    X_i^(sub) = [ Φ_i(u(1))   u_1(1)·Φ_i(u(1))   …   u_p(1)·Φ_i(u(1))
                  Φ_i(u(2))   u_1(2)·Φ_i(u(2))   …   u_p(2)·Φ_i(u(2))
                  …
                  Φ_i(u(N))   u_1(N)·Φ_i(u(N))   …   u_p(N)·Φ_i(u(N)) ]            (13)

The model output ŷ = [ ŷ(1)  ŷ(2)  …  ŷ(N) ]ᵀ is given by

    ŷ = X·w                                                                        (14)

In global estimation, the following loss function is minimized:

    I = min Σ_k e_k²                                                               (15)

where e_k = y_k − ŷ_k represents the model error for data sample {u_k, y_k}.
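Equations (6)-(15) amount to weighting a bank of local linear models by normalized Gaussian validity functions and stacking everything into one least-squares problem. A compact sketch on synthetic data; the centers, widths, sample counts, and test function below are assumptions of ours, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(2)

N, p, M = 200, 2, 3                       # samples, inputs, local linear models
U = rng.uniform(-1, 1, size=(N, p))
y = np.sin(3 * U[:, 0]) + 0.5 * U[:, 1] + 0.05 * rng.normal(size=N)

centers = rng.uniform(-1, 1, size=(M, p))  # would come from the clustering stage
sigma = 0.6 * np.ones((M, p))              # assumed Gaussian widths

# Validity functions: normalized Gaussians (eqs 9-10); rows sum to 1 (eq 7).
mu = np.exp(-0.5 * (((U[:, None, :] - centers) / sigma) ** 2).sum(axis=2))
Phi = mu / mu.sum(axis=1, keepdims=True)

# Regression matrix (eqs 12-13): blocks [Phi_i, u_1*Phi_i, ..., u_p*Phi_i].
ones = np.ones((N, 1))
X = np.hstack([np.hstack([ones, U]) * Phi[:, [i]] for i in range(M)])

# Global least-squares estimate of all M*(p+1) parameters (eq 15).
w, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ w                              # model output, eq (14)
```

Because the validity functions form a partition of unity, the constant and linear terms lie in the span of X, so the global fit can never be worse than a plain linear regression on the same inputs.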
The globally optimal parameters ŵ can be calculated either by direct methods or by iterative methods such as the Conjugate Gradient method (Taner, 2001). In the direct method (for N > M(p+1)),

    ŵ = ( Xᵀ X )⁻¹ Xᵀ y                                                            (16)

Results

Local Linear Modeling finds the weights relating the seismic attributes to the well log data. Using the blind well test (cross-validation), optimum weights are found through an iterative procedure that best maps log information from the seismic data. Once the best weights are found, 3D volumes of rock properties can be calculated from the 3D seismic data. Figure 2 shows the results of the blind well test for modeling a)
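The direct solution (16), followed by the correlation check used in the blind well test, is ordinary least squares plus a correlation coefficient on held-out data. The sketch below uses synthetic "training" and "blind" wells; the data, the hidden weights, and the helper make_well are illustrative assumptions, not the field workflow.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic attribute matrices for a training well and a blind well, plus a
# "log" generated from hidden weights (a stand-in for the real measurements).
p = 4
w_true = rng.normal(size=p + 1)

def make_well(n):
    A = rng.normal(size=(n, p))
    X = np.hstack([np.ones((n, 1)), A])        # intercept column
    log = X @ w_true + 0.1 * rng.normal(size=n)
    return X, log

X_train, y_train = make_well(300)
X_blind, y_blind = make_well(120)

# Direct normal-equations solution, eq (16): w = (X^T X)^(-1) X^T y.
w_hat = np.linalg.solve(X_train.T @ X_train, X_train.T @ y_train)

# Blind well test: correlation between predicted and measured logs.
pred = X_blind @ w_hat
r = np.corrcoef(pred, y_blind)[0, 1]
```

For ill-conditioned attribute matrices, np.linalg.lstsq is the numerically safer way to evaluate (16), which is one reason the PCA reduction upstream helps.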

density (with a correlation coefficient of 85%), b) AI (with a correlation coefficient of 87%), c) sonic (with a correlation coefficient of 84%), and d) the GR log (with a correlation coefficient of 73%). The modeled logs are shown in blue and the actual measured values in red; horizontal axes show the log values and vertical axes show the sample number (depth). After the optimal model parameters are found, they can be used to map rock-property volumes from 3D seismic data. Some modeling results are shown in Figure 3: a) shows the seismic section selected for modeling, a section about 50 ms in length; b) shows the result of density modeling, where density has been modeled for the entire seismic section and the actual log measurements (at the location of the 50th trace, where a black arrow points to the location of the well) are superimposed on the modeled background. As shown in section b, our predictions in the background are in excellent agreement with the superimposed log measurements, and high/low density layers are clearly distinguishable. The color bar gives the density values in g/cc. Section c) shows the AI section with the measured AI superimposed; the color bar gives AI values in units of kg/(s·m²). Section d) shows the modeled gamma-ray (shaliness) values for the entire 2D section; again, excellent agreement and consistency are seen between the modeled background GR values and the superimposed measured GR values at the well location (~50th trace). The color bar gives GR values in API units.

Figure 1: Network structure of a static local linear neuro-fuzzy model with M neurons for p inputs (Nelles, 2001).

Conclusions

A statistical method has been developed for estimating rock properties from seismic data and has been tested successfully on real data sets. Unlike conventional methods, which need many wells for reservoir modeling, the proposed method works with limited well log information (one well).

Acknowledgements

We thank the sponsors of the Fluid/DHI consortium for their continuous support. The first author thanks John Castagna for his helpful comments.

Figure 2: Blind well test (cross-validation) results for modeling a) density, b) AI, c) sonic, and d) the GR log, with correlation coefficients of 85%, 87%, 84%, and 73%, respectively. Horizontal axes show the log values and vertical axes show the sample number.

Figure 3: a) seismic section used for modeling the log properties; b) modeled density log values in g/cc, with the actual measured well log values superimposed at around the 50th trace; c) modeled AI values in units of kg/(s·m²); d) modeled GR log section in API units. More explanation is provided in the text.

References

Bezdek, J. C., E. C.-K. Tsao, and N. R. Pal, 1992, Fuzzy Kohonen clustering networks: IEEE International Conference on Fuzzy Systems, 1035-1043.

Hu, W., D. Xie, T. Tan, and S. Maybank, 2004, Learning activity patterns using fuzzy self-organizing neural network: IEEE Transactions on Systems, Man, and Cybernetics, 34, no. 3, 1618-1626.

Kohonen, T., 1989, Self-organization and associative memory, 3rd ed.: Springer-Verlag.

Nelles, O., 2001, Nonlinear system identification: Springer-Verlag.

Schultz, P. S., S. Ronen, M. Hattori, C. Corbett, and P. Mantran, 1994, Seismic-guided estimation of log properties, parts 1, 2, and 3: The Leading Edge, 13, 305-310, 674-678, and 770-776.

Taner, M. T., 2001, Radial basis function computation of output layer neural weights: Rock Solid Images, Technical Report.

Taner, M. T., J. S. Schuelke, R. O'Doherty, and E. Baysal, 1994, Seismic attributes revisited: 64th Annual International Meeting, SEG, Expanded Abstracts, 1104-1106.