3D Petrophysical modeling using complex seismic attributes and limited well log data
Mehdi Eftekhari Far* and De-Hua Han, Rock Physics Lab, University of Houston

Summary

A method for 3D modeling and interpretation of log properties from complex seismic attributes (obtained from 3D post-stack seismic data) is developed by integrating Principal Component Analysis and Local Linear Modeling. Complex seismic attributes have nonlinear relationships with the petrophysical properties of rocks; these complicated relationships can be approximated using statistical methods. The method has been tested successfully on real data sets. Log properties (sonic, gamma ray, density, etc.) were predicted at the location of the second well (blind well test). The method has proved to work with limited log information (data from one well), whereas conventional methods used for this purpose, such as geostatistics, need well log information from several wells to correlate seismic and well data. Once the performance of the model is verified by the blind well test, 3D log volumes can be calculated from 3D seismic attribute data.

Introduction

Seismic data are routinely and effectively used to estimate the structure of reservoir bodies. They have also been used increasingly to estimate the spatial distribution of rock properties. Many authors have investigated the possible relations between individual seismic attributes and rock properties (or structural indications); Taner et al. (1994) published a list of such relations. The idea of using multiple seismic attributes to predict log properties was first proposed by Schultz et al. (1994). They used log data from 5 wells to train with seismic attributes and predict rock properties. Hampson et al. (2001) also explain the use of different neural networks for multi-attribute analysis and reservoir property prediction. While we know that seismic signal features are directly caused by rock-physics phenomena, the links between the two are complex and difficult to derive theoretically. Seismic response depends on many variables, such as temperature, volume of clay, overburden pressure, the nature and geometry of the layering, and other factors which affect the elastic and absorption response. These complex relations can vary from one layer to another, and even within a single layer or reservoir compartment (Schultz et al., 1994). Schultz et al. (1994) showed that the simultaneous use of seismic attributes with well log data leads to better prediction of reservoir or rock properties than estimations using only well data. A reasonable way to combine seismic attributes and well log data for property prediction should therefore include a statistical method.

Methodology

The data used in this study belong to a shaly sandstone reservoir in the Middle East. Two wells were selected: one for training the network and the other for cross-validation (the distance between the two wells is approximately 4 km). The main seismic attributes used in this study are amplitude envelope, first and second derivatives of amplitude, quadrature trace, instantaneous frequency, and dominant frequency. The attribute data are normalized and reduced using principal component analysis before the modeling task. The unsupervised clustering method proposed in this paper is the Fuzzy Self-Organizing Map (FSOM), in which the Gustafson-Kessel algorithm is integrated into the learning and updating strategies of the Kohonen Self-Organizing Map (Kohonen, 1989). Bezdek et al. (1992) introduced an FSOM method by combining FCM clustering with the learning rate and updating strategies of the SOM and showed the superiority of this method over the SOM. Hu et al. (2004) also show that the FSOM is more efficient than the SOM and vector quantization in both speed and accuracy.
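The attribute normalization and PCA reduction step mentioned above can be illustrated with a minimal numpy sketch. It assumes z-score normalization, an SVD-based PCA, and a 95% retained-variance cutoff; the function name, the cutoff, and the synthetic data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def pca_reduce(attributes, variance_kept=0.95):
    """Normalize seismic attributes (one column each) and reduce them with PCA.

    attributes : (n_samples, n_attributes) array of complex-trace attributes,
                 e.g. amplitude envelope, quadrature trace, instantaneous
                 frequency, dominant frequency.
    Returns the scores on the principal components that explain the requested
    fraction of the total variance.
    """
    # z-score normalization so every attribute has zero mean and unit variance
    mu = attributes.mean(axis=0)
    sigma = attributes.std(axis=0)
    sigma = np.where(sigma > 0, sigma, 1.0)   # guard against constant columns
    z = (attributes - mu) / sigma

    # principal components from the SVD of the normalized data
    _, s, vt = np.linalg.svd(z, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    n_keep = int(np.searchsorted(np.cumsum(explained), variance_kept)) + 1

    # project the normalized data onto the retained components
    return z @ vt[:n_keep].T

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_attributes = rng.normal(size=(1000, 6))   # stand-in for 6 attributes
    print(pca_reduce(fake_attributes).shape)
```

The reduced scores, rather than the raw attributes, are what would be fed into the clustering and modeling steps described next.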
In this method, equation (1) is devised for updating the centers of the clusters (Bezdek et al., 1992):

c_i(t+1) = c_i(t) + \frac{\sum_{l=1}^{N} \alpha_{il}(t)\,\big(u_l - c_i(t)\big)}{\sum_{l=1}^{N} \alpha_{il}(t)}   (1)

where

\alpha_{il}(t) = \left[ \sum_{k=1}^{K} \left( \frac{D_{il}(t)}{D_{kl}(t)} \right)^{2/(m-1)} \right]^{-1}   (2)

D_{il}^{2}(t) = \big(u_l - c_i(t)\big)^{T}\, \Sigma_i \,\big(u_l - c_i(t)\big)   (3)

\Sigma_i = \big(\rho_i \det(F_i)\big)^{1/d}\, F_i^{-1}   (4)

F_i = \frac{\sum_{l=1}^{N} \alpha_{il}^{m}\,\big(u_l - c_i(t)\big)\big(u_l - c_i(t)\big)^{T}}{\sum_{l=1}^{N} \alpha_{il}^{m}}   (5)

Here l = 1, ..., N indexes the data points, i = 1, ..., K (or C) indexes the clusters, j = 1, ..., d indexes the dimensions of the input data (the number of attributes), c_i is the center of cluster i, \alpha_{il} is the membership degree of data sample u_l in cluster i, m is the fuzziness exponent, \rho_i is the cluster volume, and t is the iteration number (Nelles, 2001).
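As a rough illustration of equations (1) through (5), the sketch below performs one membership and center update of the Gustafson-Kessel flavor described above. It is a simplified reading of the published formulas rather than the authors' code: the fuzziness exponent m = 2, unit cluster volumes rho, the small regularization eps, and the order of the updates are all assumptions.

```python
import numpy as np

def gk_fsom_step(u, centers, alpha, m=2.0, rho=None, eps=1e-9):
    """One fuzzy clustering update in the spirit of equations (1)-(5).

    u       : (N, d) data points (reduced seismic attributes)
    centers : (K, d) current cluster centers c_i(t)
    alpha   : (K, N) current membership degrees alpha_il(t)
    Returns (new_centers, new_alpha).
    """
    N, d = u.shape
    K = centers.shape[0]
    rho = np.ones(K) if rho is None else rho          # assumed cluster volumes

    dist2 = np.empty((K, N))
    for i in range(K):
        diff = u - centers[i]                         # residuals u_l - c_i(t)
        am = alpha[i] ** m
        # fuzzy covariance matrix F_i, eq. (5)
        F = (am[:, None] * diff).T @ diff / am.sum()
        F += eps * np.eye(d)                          # keep F_i invertible
        # norm-inducing matrix Sigma_i, eq. (4)
        S = (rho[i] * np.linalg.det(F)) ** (1.0 / d) * np.linalg.inv(F)
        # squared Mahalanobis distances D_il^2, eq. (3)
        dist2[i] = np.einsum('nd,de,ne->n', diff, S, diff) + eps

    # membership update, eq. (2); dist2 holds D^2, so the exponent is 1/(m-1)
    ratio = (dist2[:, None, :] / dist2[None, :, :]) ** (1.0 / (m - 1.0))
    new_alpha = 1.0 / ratio.sum(axis=1)               # shape (K, N)

    # center update, eq. (1): a membership-weighted move toward the data
    total = new_alpha.sum(axis=1, keepdims=True)
    new_centers = centers + (new_alpha @ u - total * centers) / total
    return new_centers, new_alpha
```

Iterating this step lets the memberships and centers converge together; the resulting centers are then embedded in the hidden layer of the local linear neuro-fuzzy network described next.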

After clustering the data, the cluster centers can be embedded in the hidden layer of the local linear neuro-fuzzy network. The network structure of a local linear neuro-fuzzy model is depicted in Figure 1. Each neuron realizes a local linear model (LLM) and an associated validity function that determines the region of validity of that LLM. The outputs of the LLMs are (Nelles, 2001)

\hat{y}_i = w_{i0} + w_{i1} u_1 + w_{i2} u_2 + \dots + w_{ip} u_p, \quad i = 1, \dots, M,   (6)

where N is the number of data samples, M is the number of local linear models (in other words, clusters), p is the dimensionality of the input space (the number of attributes), and w_{ij} denotes the LLM parameters of neuron i. The validity functions form a partition of unity, i.e., they are normalized such that

\sum_{i=1}^{M} \Phi_i(u) = 1.   (7)

The output of a local linear neuro-fuzzy model is

\hat{y} = \sum_{i=1}^{M} \big( w_{i0} + w_{i1} u_1 + \dots + w_{ip} u_p \big)\, \Phi_i(u).   (8)

The network output is therefore calculated as a weighted sum of the outputs of the local linear models, where the \Phi_i(u) are interpreted as operating-point-dependent weighting factors. The neuro-fuzzy network interpolates between the different LLMs with the validity functions, which are chosen as normalized Gaussians (Nelles, 2001):

\Phi_i(u) = \frac{\mu_i(u)}{\sum_{j=1}^{M} \mu_j(u)}   (9)

where

\mu_i(u) = \exp\left( -\frac{1}{2} \left[ \frac{(u_1 - c_{i1})^2}{\sigma_{i1}^2} + \dots + \frac{(u_p - c_{ip})^2}{\sigma_{ip}^2} \right] \right).   (10)

In the global estimation approach (Nelles, 2001), all parameters are estimated simultaneously by least-squares optimization. The parameter vector w contains all n = M(p+1) parameters of the local linear neuro-fuzzy model in (8) with M neurons and p inputs:

w = [\, w_{10}, w_{11}, \dots, w_{1p}, \dots, w_{M0}, w_{M1}, \dots, w_{Mp} \,]^{T}.   (11)

The associated regression matrix X for N measured data points is (Nelles, 2001)

X = [\, X_1^{(sub)} \; X_2^{(sub)} \; \dots \; X_M^{(sub)} \,],   (12)

with

X_i^{(sub)} = \begin{bmatrix} \Phi_i(u(1)) & u_1(1)\,\Phi_i(u(1)) & \dots & u_p(1)\,\Phi_i(u(1)) \\ \Phi_i(u(2)) & u_1(2)\,\Phi_i(u(2)) & \dots & u_p(2)\,\Phi_i(u(2)) \\ \vdots & \vdots & & \vdots \\ \Phi_i(u(N)) & u_1(N)\,\Phi_i(u(N)) & \dots & u_p(N)\,\Phi_i(u(N)) \end{bmatrix}.   (13)

The model output \hat{y} = [\hat{y}(1)\; \hat{y}(2)\; \dots\; \hat{y}(N)]^{T} is given by

\hat{y} = X\, w.   (14)

In global estimation, the following loss function is minimized:

I = \sum_{i=1}^{N} e_i^{2} \rightarrow \min,   (15)

where e_i = y_i - \hat{y}_i represents the model error for data sample \{u_i, y_i\}. The globally optimal parameters \hat{w} can be calculated either by direct methods or by iterative methods such as the conjugate gradient method (Taner, 2001). In the direct method (for N > M(p+1)),

\hat{w} = (X^{T} X)^{-1} X^{T} y.   (16)

Results

Local Linear Modeling finds the weights relating the seismic attributes to the well log data. Using the blind well test (cross-validation), optimum weights are found through an iterative procedure that best maps log information from seismic data. Once the best weights are found, 3D volumes of rock properties can be calculated from the 3D seismic data.
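Equations (6) through (16) translate almost directly into code. The following sketch builds the normalized Gaussian validity functions of equations (9)-(10), assembles the regression matrix of equations (12)-(13), and solves the direct least-squares problem of equation (16). The Gaussian widths sigma are assumed known (in practice they would come from the clustering step), and a pseudo-inverse is used for numerical stability; this is an illustrative sketch, not the authors' implementation.

```python
import numpy as np

def validity_functions(U, centers, sigma):
    """Normalized Gaussian validity functions Phi_i(u), eqs. (9)-(10).

    U       : (N, p) inputs (reduced seismic attributes)
    centers : (M, p) cluster centers embedded in the hidden layer
    sigma   : (M, p) Gaussian widths (assumed given here)
    Returns Phi with shape (N, M); each row sums to one, eq. (7).
    """
    d2 = ((U[:, None, :] - centers[None, :, :]) / sigma[None, :, :]) ** 2
    mu = np.exp(-0.5 * d2.sum(axis=2))                 # eq. (10)
    return mu / mu.sum(axis=1, keepdims=True)          # eq. (9)

def regression_matrix(U, Phi):
    """Regression matrix X = [X_1 ... X_M] of eqs. (12)-(13)."""
    N = U.shape[0]
    ones_U = np.hstack([np.ones((N, 1)), U])           # [1, u_1, ..., u_p]
    return np.hstack([ones_U * Phi[:, [i]] for i in range(Phi.shape[1])])

def fit_global_least_squares(U, y, centers, sigma):
    """Globally optimal LLM parameters w_hat of eq. (16)."""
    Phi = validity_functions(U, centers, sigma)
    X = regression_matrix(U, Phi)
    # pinv guards against an ill-conditioned X^T X; otherwise this is eq. (16)
    return np.linalg.pinv(X.T @ X) @ X.T @ y

def predict(U, w_hat, centers, sigma):
    """Model output y_hat = X w, eqs. (8) and (14)."""
    Phi = validity_functions(U, centers, sigma)
    return regression_matrix(U, Phi) @ w_hat
```

In the blind well test described below, parameters fitted at the training well would be used with predict() at the second well and compared against the measured logs.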

Figure 2 shows the results of the blind well test for modeling a) density (with a correlation coefficient of 85%), b) AI (with a correlation coefficient of 87%), c) sonic (with a correlation coefficient of 84%), and d) the GR log (with a correlation coefficient of 73%). The modeled logs are shown in blue and the actual measured values in red. Horizontal axes show the log values and vertical axes show sample number (depth).

After the optimal model parameters are found, they can be used to map rock property volumes from 3D seismic data. Some modeling results are shown in Figure 3: a) shows the seismic section selected for modeling, covering about 50 ms; b) shows the result of density modeling, where density has been modeled for the entire seismic section and the actual log measurements (at the location of the 50th trace, where a black arrow points to the location of the well) are superimposed on the modeled background. As shown in section b, the background predictions are in excellent agreement with the superimposed log measurements, and high/low density layers are clearly distinguishable; the color bar shows density values in g/cc. Section c) shows the AI section with the measured AI superimposed; the color bar represents AI values in units of kg/(s·m²). Section d) shows the modeled gamma ray (shaliness) values for the entire 2D section; again, excellent agreement and consistency are seen between the modeled background GR values and the superimposed measured GR values at the well location (~50th trace). The color bar shows GR values in API units.

Figure 1: Network structure of a static local linear neuro-fuzzy model with M neurons for p inputs (Nelles, 2001).

Conclusions

A statistical method has been developed for estimating rock properties from seismic data and has been tested successfully on real data sets. Unlike conventional methods that need many wells for reservoir modeling, the proposed method works with limited well log information (one well).

Acknowledgements

We thank the sponsors of the Fluid/DHI consortium for their continuous support. The first author thanks John Castagna for his helpful comments.

Figure 2: Blind well test (cross-validation) results for modeling a) density, b) AI, c) sonic, and d) GR log, with correlation coefficients of 85%, 87%, 84%, and 73%, respectively. Horizontal axes show the log values and vertical axes show sample number (the time interval is about 00 ms).

Figure 3: a) seismic section used for modeling the log properties; b) modeled density log values in g/cc units, with the actual measured well log values superimposed at around the 50th trace; c) modeled AI values in units of kg/(s·m²); d) modeled GR log section in API units. More explanation is provided in the text.
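The correlation coefficients quoted in these captions are presumably ordinary Pearson correlations between the modeled and measured logs at the blind well. Assuming both logs are already resampled onto the same depth/time axis, a minimal check of that kind might look like this:

```python
import numpy as np

def blind_well_correlation(modeled_log, measured_log):
    """Pearson correlation (in percent) between a modeled and a measured log,
    both assumed to be sampled on the same depth/time axis."""
    r = np.corrcoef(modeled_log, measured_log)[0, 1]
    return 100.0 * r
```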

EDITED REFERENCES

Note: This reference list is a copy-edited version of the reference list submitted by the author. Reference lists for the 2011 SEG Technical Program Expanded Abstracts have been copy edited so that references provided with the online metadata for each paper will achieve a high degree of linking to cited sources that appear on the Web.

REFERENCES

Bezdek, J. C., E. C.-K. Tsao, and N. R. Pal, 1992, Fuzzy Kohonen clustering networks: IEEE International Conference on Fuzzy Systems.
Hu, W., D. Xie, T. Tan, and S. Maybank, 2004, Learning activity patterns using fuzzy self-organizing neural network: IEEE Transactions on Systems, Man, and Cybernetics, 34, no. 3, 1618–1626.
Kohonen, T., 1989, Self-organization and associative memory, 3rd edition: Springer-Verlag.
Nelles, O., 2001, Nonlinear system identification: Springer-Verlag.
Schultz, P. S., S. Ronen, M. Hattori, C. Corbett, and P. Mantran, 1994, Seismic guided estimation of log properties, parts 1, 2, and 3: The Leading Edge, 13.
Turhan Taner, M., 2001, Radial basis function computation of output layer neural weights: Rock Solid Images, Technical Report.
Turhan Taner, M., J. S. Schuelke, R. O'Doherty, and E. Baysal, 1994, Seismic attributes revisited: 64th Annual International Meeting, SEG, Expanded Abstracts, 1104–1106.
