Data Mining in Petroleum Upstream: The Use of Regression and Classification Algorithms


Dawei Li, Guangren Shi
Research Institute of Petroleum Exploration & Development, PetroChina, Beijing 100083, China
Email: leedw@petrochina.com.cn

Scientific Journal of Earth Science, June 2015, Volume 5, Issue 2

Abstract

Data mining (DM) techniques have seen enormous success in some fields, but their application to the petroleum upstream (PUP) is still at an initial stage, because PUP differs from other fields in many aspects. The most popular DM algorithms in PUP are regression and classification. Through many DM applications to PUP, we have found that: a) the preferable algorithm for regression is the back-propagation neural network (BPNN), followed by the regression of support vector machine (R-SVM) and multiple regression analysis (MRA), and the preferable algorithm for classification is the classification of support vector machine (C-SVM), followed by Bayesian successive discrimination (BAYSD); b) C-SVM can also be applied in data cleaning; c) both MRA and BAYSD can also be applied in dimension reduction, and BAYSD is the preferable one; d) R-mode cluster analysis (RCA) can be applied in dimension reduction, while Q-mode cluster analysis (QCA) can be applied in sample reduction. A case study in PUP indicates that R-SVM, BPNN and MRA are not applicable for regression in that case, whereas C-SVM is applicable for classification.

Keywords: Data Mining; Regression; Classification; Data Cleaning; Dimension Reduction; Sample Reduction; Oil Layer

1 INTRODUCTION

Petroleum upstream (PUP) entered the big-data epoch years ago. For example, about 50 large information systems have been built in PetroChina, and only one of them holds about 590 TB of structured PUP and well data. At present, the major uses of the PUP data in these information systems are limited to storage and inquiry, which falls far short of fully exploiting the value of these data assets. Data mining (DM) is a preferable solution to this problem. DM techniques have seen enormous success in some fields of business and science, but their application to PUP is still at an initial stage, because PUP differs from other fields in many aspects, such as miscellaneous data types, huge data volumes, differing measurement precision, and many uncertainties affecting data mining results.

DM can be divided into two classes: small data mining (SDM), which deals with at most several thousand samples, and big data mining (BDM), which deals with at least ten thousand samples. In PUP, seismic, remote sensing and well log data are potential applications of BDM, e.g. Zhu and Shi (2013) and Shi et al. (2014) used BDM on well log data [1][2]; the other data are potential applications of SDM, e.g. Shi (2013) used SDM in 36 case studies [3], and Shi (2015) used SDM in 8 case studies of petroleum geology [4].

In DM, the most popular algorithms are regression and classification. There are mainly three regression and five classification algorithms [3]: multiple regression analysis (MRA), the error back-propagation neural network (BPNN), the regression of support vector machine (R-SVM), the classification of support vector machine (C-SVM), decision trees (DTR), Naïve Bayesian (NBAY), Bayesian discrimination (BAYD), and Bayesian successive discrimination (BAYSD). These eight algorithms use the same known parameters and predict the same unknown; the only differences among them are the method and the calculation results. It is well known that MRA, BPNN and R-SVM are regression algorithms that give real-number results, while C-SVM, DTR, NBAY, BAYD and BAYSD are classification algorithms that give integer-number results.
Among the eight algorithms, only MRA is a linear algorithm whereas the other seven are nonlinear algorithms; this is due to the fact that MRA constructs a linear function whereas the other seven algorithms each construct a nonlinear function. Since the usage of DTR is quite complicated and BAYSD is superior to BAYD, these two classification algorithms (DTR and BAYD) are not introduced in this paper.

2 REGRESSION AND CLASSIFICATION ALGORITHMS

Assume that there are n learning samples, each associated with m+1 numbers (x_1, x_2, ..., x_m, y) and a set of observed values (x_i1, x_i2, ..., x_im, y_i), with i = 1, 2, ..., n for these numbers. In principle n > m, but in actual practice n >> m. The n samples associated with m+1 numbers are defined as n vectors:

    x_i = (x_i1, x_i2, ..., x_im, y_i)    (i = 1, 2, ..., n)    (1)

where n is the number of learning samples; m is the number of independent variables in the samples; x_i is the i-th learning sample vector; x_ij is the value of the j-th independent variable in the i-th learning sample, j = 1, 2, ..., m; and y_i is the observed value of the dependent variable in the i-th learning sample. Equation (1) is the expression of the learning samples.

Let x^0 be the general form of a vector (x_1, x_2, ..., x_m). The principles of MRA, BPNN, NBAY and BAYSD are the same, i.e. to try to construct an expression, y = y(x^0), such that Equation (2) is minimized. Certainly, these different algorithms use different approaches and achieve different calculation accuracies.

    Σ_{i=1}^{n} |y(x_i^0) − y_i|    (2)

where y(x_i^0) is the calculation result of the dependent variable in the i-th learning sample, and the other symbols have been defined in Equation (1). However, the principle of the R-SVM and C-SVM algorithms is to try to construct an expression, y = y(x^0), that maximizes the margin based on the support vector points so as to obtain the optimal separating line. This y = y(x^0) is called the fitting formula obtained in the learning process. The fitting formulas of different algorithms are different. In this paper, y is defined as a single variable.

The workflow is as follows: the 1st step is the learning process, using the n learning samples to obtain a fitting formula; the 2nd step is the learning validation, substituting the n learning samples into the fitting formula to get prediction values (y_1, y_2, ..., y_n), so as to verify the fitness of an algorithm; and the 3rd step is the prediction process, substituting the k prediction samples expressed with Equation (3) into the fitting formula to get prediction values (y_{n+1}, y_{n+2}, ..., y_{n+k}).

    x_i = (x_i1, x_i2, ..., x_im)    (i = n+1, n+2, ..., n+k)    (3)

where k is the number of prediction samples; x_i is the i-th prediction sample vector; and the other symbols have been defined in Equation (1). Equation (3) is the expression of the prediction samples.

2.1 Error Analysis of Calculation Results

To express the calculation accuracy of the prediction variable y for learning and prediction samples when the aforementioned six algorithms (MRA, BPNN, R-SVM, C-SVM, NBAY, and BAYSD) are used, the absolute relative residual R(%), the mean absolute relative residual R1(%) or R2(%), and the total mean absolute relative residual R*(%) are adopted.

The absolute relative residual for each sample, R(%), is defined as

    R_i(%) = |(y_i' − y_i) / y_i| × 100    (4)

where y_i' is the calculation result of the dependent variable for the i-th sample, and the other symbols have been defined in Equations (1) and (3). It is noted that zero must not be taken as a value of y_i, to avoid floating-point overflow. Therefore, for regression algorithms, a sample is deleted if its y_i = 0; and for classification algorithms, positive integers are taken as the values of y_i.

The mean absolute relative residual over all learning samples or all prediction samples is defined as

    Σ_{i=1}^{N_s} R_i(%) / N_s    (5)

where N_s = n for learning samples while N_s = k for prediction samples, and the other symbols have been defined in Equations (1) and (3). For learning samples, R(%) and this mean are called the fitting residuals and express the fitness of the learning process, and the mean is designated R1(%); for prediction samples, they are called the prediction residuals and express the accuracy of the prediction process, and the mean is designated R2(%).

The total mean absolute relative residual for all samples, R*(%), is defined as

    R*(%) = [R1(%) + R2(%)] / 2    (6)

When there are no prediction samples, R*(%) = R1(%).
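The three-step workflow above and the residual measures of Equations (4)-(6) can be illustrated with a short script. The sketch below is not the authors' implementation; it assumes scikit-learn analogues of three of the algorithms (LinearRegression for MRA, MLPRegressor for BPNN, SVR for R-SVM) and placeholder arrays X_learn, y_learn, X_pred, y_pred_true standing in for the learning and prediction samples.

```python
# Minimal sketch of the learning / validation / prediction workflow and the
# residual measures R(%), R1(%), R2(%), R*(%). Not the authors' code; it uses
# scikit-learn analogues (LinearRegression ~ MRA, MLPRegressor ~ BPNN, SVR ~ R-SVM)
# and placeholder arrays.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

def abs_rel_residuals(y_true, y_calc):
    """Per-sample absolute relative residual R(%) of Equation (4); y_true must be nonzero."""
    return np.abs((y_calc - y_true) / y_true) * 100.0

def mean_abs_rel_residual(y_true, y_calc):
    """Mean absolute relative residual of Equation (5)."""
    return abs_rel_residuals(y_true, y_calc).mean()

# Placeholder data: n learning samples and k prediction samples with m variables.
rng = np.random.default_rng(0)
X_learn, y_learn = rng.random((12, 7)), rng.random(12) + 1.0
X_pred, y_pred_true = rng.random((1, 7)), rng.random(1) + 1.0

models = {
    "MRA": LinearRegression(),
    "BPNN": MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0),
    "R-SVM": SVR(kernel="rbf"),
}
for name, model in models.items():
    model.fit(X_learn, y_learn)                                       # step 1: learning process
    r1 = mean_abs_rel_residual(y_learn, model.predict(X_learn))       # step 2: learning validation
    r2 = mean_abs_rel_residual(y_pred_true, model.predict(X_pred))    # step 3: prediction process
    r_star = (r1 + r2) / 2.0                                          # Equation (6)
    print(f"{name}: R1(%)={r1:.2f}, R2(%)={r2:.2f}, R*(%)={r_star:.2f}")
```

In the case study of Section 4, R1(%) corresponds to the fitting residual of the 12 learning samples and R2(%) to the prediction residual of the single prediction sample.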

2.2 Nonlinearity and Solution Accuracy of the Studied Problem

Since MRA is a linear algorithm, its R*(%) for a studied problem expresses the nonlinearity of the y = y(x) to be solved, i.e. the nonlinearity of the studied problem. For any algorithm, whether the linear one (MRA) or the nonlinear ones (BPNN, R-SVM, C-SVM, NBAY and BAYSD), its R*(%) for a studied problem expresses the accuracy of the y = y(x) it obtains, i.e. the solution accuracy of the studied problem as solved by that algorithm. The y = y(x) created by BPNN is an implicit expression (it cannot be written as a usual mathematical formula), whereas those created by the other algorithms are explicit expressions (they can be written as usual mathematical formulas).

3 DATA PREPROCESSING

The methods of data preprocessing are various (data cleaning, data integration, data transformation, data reduction, etc.), and they are applied before data mining. Maimon and Rokach (2010) gave a detailed summary of data preprocessing [5], and Han and Kamber (2006) specified detailed methods related to data preprocessing [6]. This paper discusses only the use of the aforementioned six algorithms (MRA, BPNN, R-SVM, C-SVM, NBAY and BAYSD) in data cleaning and data reduction.

3.1 Data Cleaning

Realistic data are often noisy, imperfect and inconsistent. The main jobs of data cleaning are to fill in missing data, smooth noisy data, identify or eliminate abnormal data, and resolve inconsistencies. For example, in the case study of Zhu and Shi (2013) and Shi et al. (2014), C-SVM was applied to 2242 learning samples and it was found that R(%) of 27 learning samples was not zero. After correcting the y of those 27 samples, C-SVM was run again, resulting in R1(%) = 0 [1][2].
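The data-cleaning idea above (fit C-SVM on the learning samples, flag samples whose calculated class disagrees with the recorded class, correct them, and refit) can be sketched as follows. This is an illustrative sketch, not the code used in [1][2]; the arrays X and y_recorded are placeholders and SVC stands in for C-SVM.

```python
# Sketch of C-SVM-based data cleaning: flag learning samples whose recorded class
# disagrees with the class calculated by the fitted classifier, then refit after
# the flagged labels have been checked or corrected. Illustrative only.
import numpy as np
from sklearn.svm import SVC

def flag_suspect_samples(X, y_recorded):
    """Return indices of learning samples with nonzero R(%), i.e. mislabel candidates."""
    clf = SVC(kernel="rbf", C=10.0).fit(X, y_recorded)
    y_calc = clf.predict(X)
    return np.where(y_calc != y_recorded)[0]

# Placeholder learning data: 200 samples, 5 variables, 3 classes.
rng = np.random.default_rng(1)
X = rng.random((200, 5))
y_recorded = rng.integers(1, 4, size=200)

suspects = flag_suspect_samples(X, y_recorded)
print("samples to re-examine:", suspects)
# After the flagged y values have been verified or corrected, C-SVM is run again;
# in [1][2] this brought R1(%) of the learning samples down to 0.
```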

3.2 Data Reduction

It usually takes a lot of time to analyze complicated data in a large-scale database, which often makes the analysis unrealistic and unfeasible, particularly when interactive DM is needed. Data reduction techniques are used to obtain a simplified data set from the original huge data set while keeping the simplified data set's integrity with respect to the original one. In this way, it is obviously more efficient to perform DM on the simplified data set, and the mined results are almost the same as the results from the original data set. The main strategies for data reduction are: a) data aggregation, such as building a data cube, which is mainly used for the construction of the data cube in data warehouse operations; b) dimension reduction, which is mainly used for the detection and elimination of unrelated, weakly correlated or redundant attributes or dimensions; c) data compression, which compresses the data set with coding techniques (e.g. minimum coding length or wavelets); and d) numerosity reduction, which replaces the original data with simpler representations (e.g. parametric models, or non-parametric models such as clustering, sampling and histograms). Here we discuss only dimension reduction, which reduces the number of independent variables, and sample reduction, which reduces the number of samples.

MRA and BAYSD can serve as pioneering dimension-reduction tools [3], because both algorithms can give the dependence of the predicted value (y) on the independent variables (x_1, x_2, ..., x_m) in decreasing order. However, because MRA performs data analysis under linear correlation whereas BAYSD does so under nonlinear correlation, in applications with very strong nonlinearity the dimension-reduction ability of BAYSD is higher than that of MRA. For example, the case study of Zhu and Shi (2013) and Shi et al. (2014) is a 16-D problem (x_1, x_2, ..., x_15, y) [1][2]. For MRA, the variable on which y has the minimum dependence was deleted and C-SVM was run again; the results show R1(%) = 0.036 and R2(%) = 8.3, whereas the results without this deletion are R1(%) = 0 and R2(%) = 6.30, which indicates that this dimension reduction failed. For BAYSD, however, even when it runs with eight of the independent variables removed, its R*(%) is 6.8, which shows that the dimension of this studied problem can be reduced from 16-D to 8-D. Why is the dimension reduction of BAYSD successful while that of MRA failed? The reason is that the nonlinearity of the studied problem is strong (the R*(%) of MRA is 52.4), and BAYSD is a nonlinear algorithm while MRA is a linear algorithm.

In addition, R-mode cluster analysis (RCA) can serve as a pioneering dimension-reduction tool, while Q-mode cluster analysis (QCA) can serve as a pioneering sample-reduction tool [3]. It is noted that both RCA and QCA are linear algorithms. A tool is called pioneering because, whether it succeeds or not, it needs a nonlinear tool (BPNN for regression problems, C-SVM for classification problems) for a second validation, so as to determine how many samples and how many independent variables can actually be reduced. Why is the second validation needed? Because of the complexity of geoscience rules, the correlations between different classes of geoscience data are nonlinear in most cases.
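The two-stage procedure described above, in which a pioneering tool proposes a variable ranking and a nonlinear tool provides the second validation, can be sketched roughly as below. MRA's ranking is approximated here by the magnitude of standardized regression coefficients, and C-SVM by scikit-learn's SVC; BAYSD, RCA and QCA have no direct scikit-learn counterparts, so this is only an illustration of the validation loop, not the authors' method, and X, y are placeholder arrays.

```python
# Rough sketch of dimension reduction with a second validation: rank variables with
# a linear (MRA-like) model, drop the least important one, and accept the reduction
# only if a nonlinear classifier (SVC standing in for C-SVM) keeps its accuracy.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVC

def mra_ranking(X, y):
    """Rank variables by |standardized coefficient| of a multiple linear regression."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    coef = LinearRegression().fit(Xs, y).coef_
    return np.argsort(-np.abs(coef))            # indices, most to least important

def second_validation(X, y, drop_idx):
    """Compare the nonlinear tool's learning fitness with and without the candidate variable."""
    full = SVC(kernel="rbf").fit(X, y).score(X, y)
    X_red = np.delete(X, drop_idx, axis=1)
    reduced = SVC(kernel="rbf").fit(X_red, y).score(X_red, y)
    return full, reduced

rng = np.random.default_rng(2)
X, y = rng.random((100, 15)), rng.integers(1, 4, size=100)
ranking = mra_ranking(X, y.astype(float))
full, reduced = second_validation(X, y, drop_idx=ranking[-1])
# Keep the reduction only if the nonlinear tool loses little or no accuracy.
print(f"least important variable: x{ranking[-1] + 1}; accuracy {full:.3f} -> {reduced:.3f}")
```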
4 CASE STUDY: OIL LAYER EVALUATION

4.1 Studied Problem

The objective of this case study is the prediction of formation flow capacity, which has practical value when oil test data are limited. Using data of 14 samples from the Tahe Oilfield in the Tarim Basin, China, in which each sample contains 7 independent variables (x_1 = oil viscosity, x_2 = oil output, x_3 = choke size, x_4 = oil pressure, x_5 = oil density, x_6 = gas/oil ratio, and x_7 = water cut) and an oil test result (y = formation flow capacity), Kang et al. (2007) adopted BPNN for the prediction of formation flow capacity [7]. Actually, they adopted only 6 independent variables, without x_1 (oil viscosity), for BPNN.

In our case study, among these samples, 12 are taken as learning samples and one as the prediction sample, with all 7 independent variables (Table 1).

4.2 Input Known Parameters

These are the values of the known variables x_i (i = 1, 2, 3, 4, 5, 6, 7) for the 12 learning samples and the one prediction sample, and the value of the prediction variable y for the 12 learning samples (Table 1). Note: y = formation flow capacity for the regression calculation, and y = oil layer classification for the classification calculation.

TABLE 1. INPUT DATA FOR FORMATION FLOW CAPACITY OF THE TAHE OILFIELD

Columns: sample type (learning samples / prediction sample), sample No., well No. (wells of the Tahe Oilfield), the relative parameters for formation flow capacity x_1 to x_7 (note a), and y given as FFC and OLC (notes b, c). [The individual sample values are not reproduced here.]

a. x_1 = oil viscosity, mPa·s; x_2 = oil output, t/d; x_3 = choke size, mm; x_4 = oil pressure, MPa; x_5 = oil density, g/cm3; x_6 = gas/oil ratio; x_7 = water cut, %.
b. y = formation flow capacity (FFC, /(m·µm2)) or oil layer classification (OLC) determined by the oil test; a number in parentheses is not input data, but is used for calculating R(%).
c. When y = OLC, it is the oil layer classification (1 high-productivity oil layer, 2 intermediate-productivity oil layer, 3 low-productivity oil layer, 4 dry layer) determined by the oil test (Table 2).

TABLE 2. OIL LAYER CLASSIFICATION BASED ON FORMATION FLOW CAPACITY

Oil layer classification               Formation flow capacity (/(m·µm2))    y
High-productivity oil layer            >100                                  1
Intermediate-productivity oil layer    (25, 100]                             2
Low-productivity oil layer             [4, 25]                               3
Dry layer                              <4                                    4
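Table 2 defines the oil layer classification that serves as the classification target in Section 4.4. A small helper that performs this mapping might look as follows; it is only a sketch based on the class boundaries in Table 2, with the dry-layer bound of 4 taken from the lower end of the low-productivity interval.

```python
# Helper mapping formation flow capacity (FFC) to the oil layer classification (OLC)
# of Table 2. A sketch only: the boundary between the low-productivity layer and the
# dry layer (FFC = 4) is inferred from the low-productivity interval [4, 25].
def olc_from_ffc(ffc):
    """Return oil layer class 1-4 for a formation flow capacity value."""
    if ffc > 100:
        return 1   # high-productivity oil layer
    if ffc > 25:
        return 2   # intermediate-productivity oil layer, (25, 100]
    if ffc >= 4:
        return 3   # low-productivity oil layer, [4, 25]
    return 4       # dry layer

print([olc_from_ffc(v) for v in (150.0, 60.0, 4.8, 1.2)])  # -> [1, 2, 3, 4]
```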

4.3 Regression Calculation

R-SVM, BPNN and MRA are adopted for the regression calculation.

1) Learning Process. Using the 12 learning samples (Table 1), the three functions of formation flow capacity (y) with respect to the 7 independent variables (x_1, x_2, x_3, x_4, x_5, x_6, x_7) have been constructed by R-SVM [8][3], BPNN [3] and MRA [3], each yielding its own fitting formula. Substituting the values of x_1 to x_7 given by the 12 learning samples (Table 1) into the fitting formulas of R-SVM, BPNN and MRA, respectively, the formation flow capacity (y) of each learning sample is obtained. Table 3 shows the results of the learning process by R-SVM, BPNN and MRA.

2) Prediction Process. Substituting the values of x_1 to x_7 given by the one prediction sample (Table 1) into the fitting formulas of R-SVM, BPNN and MRA, respectively, the formation flow capacity (y) of the prediction sample is obtained. Table 3 shows the results of the prediction process by R-SVM, BPNN and MRA.

It can be seen from R*(%) = 56.65 of MRA (Table 4) that the nonlinearity of the relationship between the predicted value y and its independent variables (x_1, x_2, x_3, x_4, x_5, x_6, x_7) is very strong. The solution accuracy of R-SVM and MRA is very low, and the solution accuracy of BPNN is moderate. Since this case study is a very strongly nonlinear problem, R-SVM, BPNN and MRA are not applicable.

TABLE 3. PREDICTION RESULTS OF FORMATION FLOW CAPACITY OF THE TAHE OILFIELD

For each of the 12 learning samples and the 1 prediction sample, the table gives the oil-test formation flow capacity y (/(m·µm2)) and, for each regression algorithm (R-SVM, BPNN, MRA), the calculated y and its R(%). [The individual values are not reproduced here.]

TABLE 4. COMPARISON AMONG THE APPLICATIONS OF THREE REGRESSION ALGORITHMS TO FORMATION FLOW CAPACITY OF THE TAHE OILFIELD

Algorithm   Fitting formula       Dependence of y on (x_1, ..., x_7), in decreasing order   Time consumed on PC (Intel Core 2)   Solution accuracy
R-SVM       Nonlinear, explicit   N/A                                                       3 s                                  Very low
BPNN        Nonlinear, implicit   N/A                                                       30 s                                 Moderate
MRA         Linear, explicit      x_4, x_3, x_5, x_6, x_1, x_2, x_7                         <1 s                                 Very low

[The residual columns R1(%), R2(%) and R*(%) are not reproduced here.]

4.4 Classification Calculation

C-SVM, NBAY, BAYSD and MRA are adopted for the classification calculation.

1) Learning Process. Using the 12 learning samples (Table 1), the four functions of oil layer classification (y) with respect to the 7 independent variables (x_1, x_2, x_3, x_4, x_5, x_6, x_7) have been constructed by C-SVM [8][3], NBAY [3], BAYSD [3] and MRA [3], each yielding its own fitting formula. Substituting the values of x_1 to x_7 given by the 12 learning samples (Table 1) into the fitting formulas of C-SVM, NBAY, BAYSD and MRA, respectively, the oil layer classification (y) of each learning sample is obtained. Table 5 shows the results of the learning process by C-SVM, NBAY, BAYSD and MRA.

2) Prediction Process. Substituting the values of x_1 to x_7 given by the one prediction sample (Table 1) into the fitting formulas of C-SVM, NBAY, BAYSD and MRA, respectively, the oil layer classification (y) of the prediction sample is obtained. Table 5 shows the results of the prediction process by C-SVM, NBAY, BAYSD and MRA.

It can be seen from R*(%) = 8.8 of MRA (Table 6) that the nonlinearity of the relationship between the predicted value y and its independent variables (x_1, x_2, x_3, x_4, x_5, x_6, x_7) is strong. The solution accuracy of C-SVM is very high, i.e. not only R1(%) = 0 but also R2(%) = 0, and thus the total mean absolute relative residual R*(%) = 0, which coincides with practice. Since this case study is a strongly nonlinear problem, only C-SVM is applicable, while NBAY and BAYSD are not applicable due to very low solution accuracy.
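The classification comparison above can be mimicked with standard libraries: SVC for C-SVM, GaussianNB for NBAY, and a linear regression whose real-valued output is rounded to the nearest class, as the paper does for MRA in Table 5. BAYSD has no off-the-shelf scikit-learn counterpart and is omitted. This is only a sketch with placeholder arrays, not the authors' implementation.

```python
# Sketch of the classification comparison: SVC ~ C-SVM, GaussianNB ~ NBAY, and
# MRA as a linear regression whose output is rounded to the nearest integer class
# (the "round rule" used for Table 5). Placeholder arrays; BAYSD is omitted.
import numpy as np
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LinearRegression

def class_residual_pct(y_true, y_calc):
    """R(%) of Equation (4) applied to integer classes (classes are positive, so y != 0)."""
    return np.abs((y_calc - y_true) / y_true) * 100.0

rng = np.random.default_rng(3)
X_learn, y_learn = rng.random((12, 7)), rng.integers(1, 5, size=12)   # classes 1-4
X_pred = rng.random((1, 7))

csvm = SVC(kernel="rbf").fit(X_learn, y_learn)
nbay = GaussianNB().fit(X_learn, y_learn)
mra = LinearRegression().fit(X_learn, y_learn)

for name, y_calc in (
    ("C-SVM", csvm.predict(X_learn)),
    ("NBAY", nbay.predict(X_learn)),
    # round rule for MRA, clipped to the valid class range 1-4
    ("MRA", np.clip(np.rint(mra.predict(X_learn)), 1, 4).astype(int)),
):
    print(name, "fitting residual R1(%):", class_residual_pct(y_learn, y_calc).mean())
```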

TABLE 5. PREDICTION RESULTS FROM OIL LAYER CLASSIFICATION OF THE TAHE OILFIELD

For each of the 12 learning samples and the 1 prediction sample, the table gives the oil-test oil layer classification y (note a) and, for each classification algorithm (C-SVM, NBAY, BAYSD, MRA (note b)), the calculated y and its R(%). [The individual values are not reproduced here.]

a. y = oil layer classification (1 high-productivity oil layer, 2 intermediate-productivity oil layer, 3 low-productivity oil layer, 4 dry layer) determined by the oil test (Table 2).
b. The y values calculated by MRA are converted from real numbers to integers by the round rule.

TABLE 6. COMPARISON AMONG THE APPLICATIONS OF FOUR CLASSIFICATION ALGORITHMS TO OIL LAYER CLASSIFICATION OF THE TAHE OILFIELD

Algorithm   Fitting formula       Dependence of y on (x_1, ..., x_7), in decreasing order   Time consumed on PC (Intel Core 2)   Solution accuracy
C-SVM       Nonlinear, explicit   N/A                                                       5 s                                  Very high
NBAY        Nonlinear, explicit   N/A                                                       <1 s                                 Very low
BAYSD       Nonlinear, explicit   x_1, x_7, x_4, x_3, x_5, x_6, x_2                         1 s                                  Very low
MRA         Linear, explicit      x_2, x_4, x_1, x_5, x_3, x_6, x_7                         <1 s                                 Strong nonlinearity

[The residual columns R1(%), R2(%) and R*(%) are not reproduced here.]

4.5 Summary

In summary, using data for the formation flow capacity and the oil layer classification in the Tahe Oilfield, based on seven independent variables (oil viscosity, oil output, choke size, oil pressure, oil density, gas/oil ratio, and water cut) and an oil test result for 13 samples, of which 12 are taken as learning samples and one as the prediction sample, the regression algorithms (R-SVM, BPNN, MRA) are adopted for the formation flow capacity and the classification algorithms (C-SVM, NBAY and BAYSD) are used for the oil layer classification. It is found that a) since the formation flow capacity is a very strongly nonlinear problem, R-SVM, BPNN and MRA are not applicable; and b) since the oil layer classification is a strongly nonlinear problem, only C-SVM is applicable, while NBAY and BAYSD are not applicable due to very low solution accuracy.

5 CONCLUSIONS

DM is a useful technique for exploiting the value of the data assets in PUP. Through many DM applications to PUP [1][2][3][4], we have found that: a) the preferable algorithm for regression problems is BPNN, followed by R-SVM and MRA, and the preferable algorithm for classification problems is C-SVM, followed by BAYSD; b) C-SVM can also be applied in data cleaning; c) both MRA and BAYSD can also be applied in dimension reduction, and BAYSD is the preferable one; and d) R-mode cluster analysis (RCA) can be applied in dimension reduction, while Q-mode cluster analysis (QCA) can be applied in sample reduction. It is noted that both RCA and QCA are linear algorithms. Finally, a case study (oil layer evaluation) indicates that in this case R-SVM, BPNN and MRA are not applicable for regression, whereas C-SVM is applicable for classification.

REFERENCES

[1] Zhu, Y., Shi, G. Identification of lithologic characteristics of volcanic rocks by support vector machine. Acta Petrolei Sinica, 34(2), 2013 (in Chinese)
[2] Shi, G., Zhu, Y., Mi, S., Ma, J., Wan, J. A Big Data Mining in Petroleum Exploration and Development. CSCanada, 7(2), 1-8, 2014
[3] Shi, G. Data Mining and Knowledge Discovery for Geoscientists. Elsevier Inc., USA, 2013
[4] Shi, G. Optimal prediction in petroleum geology by regression and classification methods. Sci J Inf Eng, 5(2), 4-32, 2015
[5] Maimon, O., Rokach, L. The Data Mining and Knowledge Discovery Handbook, 2nd edition. Springer, New York, NY, USA, 2010
[6] Han, J.W., Kamber, M. Data Mining: Concepts and Techniques, 2nd edition. Morgan Kaufmann, San Francisco, CA, USA, 2006
[7] Kang, Z., Guo, C., Wu, W. Technique of dynamic descriptions to the crack and cave carbonate rock reservoir in the Tahe oil field, Xinjiang, China. Journal of Chengdu University of Technology (Science & Technology Edition), 34(2), 43-46, 2007 (in Chinese)
[8] Chang, C., Lin, C. LIBSVM: a library for support vector machines, Version 3.1. Retrieved from ~cjlin/libsvm/, 2011

AUTHORS

Dawei Li is a senior engineer of PetroChina, born in Hebei Province, China, on May 5th, 1969. He received a doctoral degree in petroleum geology from China University of Geosciences, Beijing, China, in 1996. Since then, he has worked as an engineer in geophysics, petroleum geology and IT in different departments of PetroChina (such as the Bureau of Geophysical Prospecting and the Research Institute of Petroleum Exploration and Development). He has published several books and more than 40 articles. Two of the major published books are Tectonic Types of Oil and Gas Basins in China (Beijing: Petroleum Industry Press, 2003) and Neotectonism and Hydrocarbon Accumulation in Huanghua Depression, China (in Chinese) (Wuhan, Hubei Province: China University of Geosciences Press, 2006). Two of his important published articles are "Ten critical factors for the evaluation of enterprise informatization benefits" (in Chinese, 2014) and "Discussion on informatization in PetroChina" (in Chinese, 2009). His present job is to build and manage PUP information management and DM systems for PetroChina. Dr. Li is a member of the China Geological Society and the China Petroleum Society.

Guangren Shi is a professor of PetroChina, born in Shanghai, China, in February 1940. His research covers two fields: basin modeling (petroleum systems) and data mining for geosciences, on which he has published a number of books and articles.
