COMPARATIVE STUDIES OF METAMODELING TECHNIQUES UNDER MULTIPLE MODELING CRITERIA

COMPARATIVE STUDIES OF METAMODELING TECHNIQUES UNDER MULTIPLE MODELING CRITERIA

Ruichen Jin* and Wei Chen†
Department of Mechanical Engineering, University of Illinois at Chicago, Chicago, Illinois

Timothy W. Simpson‡
Department of Mechanical & Nuclear Engineering, The Pennsylvania State University, University Park, PA 16802

Copyright 2000 by Wei Chen. Published by the American Institute of Aeronautics and Astronautics, Inc., with permission.
* Graduate research assistant.
† Assistant Professor, Member AIAA, corresponding author, wechen@uic.edu.
‡ Assistant Professor, Member AIAA.

Abstract

Despite the advances in computer capacity, the enormous computational cost of complex engineering simulations makes it impractical to rely exclusively on simulation for the purpose of design optimization. To cut down the cost, surrogate models, also known as metamodels, are constructed from and then used in lieu of the actual simulation models. In this paper, we systematically compare four popular metamodeling techniques (Polynomial Regression, Multivariate Adaptive Regression Splines, Radial Basis Functions, and Kriging) based on multiple performance criteria using fourteen test problems representing different classes of problems. Our objective in this study is to investigate the advantages and disadvantages of these four metamodeling techniques using multiple modeling criteria and multiple test problems rather than a single measure of merit and a single test problem.

1 Introduction

Simulation-based analysis tools are finding increased use during preliminary design to explore design alternatives at the system level. In spite of advances in computer capacity and speed, the enormous computational cost of complex, high fidelity scientific and engineering simulations makes it impractical to rely exclusively on simulation codes for the purpose of design optimization. A preferable strategy is to utilize approximation models, which are often referred to as metamodels because they provide a "model of the model" (Kleijnen, 1987), replacing the expensive simulation model during the design and optimization process. Metamodeling techniques have been widely used for design evaluation and optimization in many engineering applications; a comprehensive review of metamodeling applications in mechanical and aerospace systems can be found in (Simpson, et al., 1997) and will therefore not be repeated here. For the interested reader, a review of metamodeling applications in structural optimization can be found in (Barthelemy and Haftka, 1993); for metamodeling applications in multidisciplinary design optimization, see (Sobieszczanski-Sobieski and Haftka, 1997).

A variety of metamodeling techniques exist; the Response Surface Methodology (Box, et al., 1978; Myers and Montgomery, 1995) and Artificial Neural Network (ANN) methods (Smith, 1993; Cheng and Titterington, 1994) are two well-known approaches for constructing simple and fast approximations of complex computer codes. An interpolation method known as kriging is becoming widely used for the design and analysis of computer experiments (Sacks, et al., 1989; Booker, et al., 1999). Finally, other statistical techniques that hold a lot of promise, such as Multivariate Adaptive Regression Splines (Friedman, 1991) and radial basis function approximations (Hardy, 1971; Dyn, et al., 1986), are beginning to draw the attention of many researchers.

An immediate question that a designer may have is: on what basis should the various techniques be used? Moreover, is one technique superior to the others? Numerous examples exist that demonstrate the application of one metamodeling technique or another, typically for a specific application; however, our survey reveals a lack of comprehensive comparative studies of the various techniques, let alone standard procedures for testing the relative merits of different methods. In Simpson, et al.
(1998), kriging methods are compared against polynomial regression models for the multidisciplinary design optimization of an aerospike nozzle. Giunta, et al. (1998) also compare kriging models and polynomial regression models for two test problems with 5 and 10 variables. In Varadarajan, et al. (2000), Artificial Neural Network (ANN) methods are compared with polynomial regression models for the engine design problem of modeling nonlinear thermodynamic behavior. In Yang, et al. (2000), four approximation methods, namely enhanced Multivariate Adaptive Regression Splines (MARS), Stepwise

Regression, ANN, and the Moving Least Squares method, are compared for the construction of safety-related functions in automotive crash analysis with a relatively small sampling size. Simpson (1999) presents the results of ongoing work investigating different metamodeling techniques (response surfaces, kriging models, radial basis functions, and MARS models) on a variety of engineering test problems. Although the existing studies provide useful insights into the various approaches considered, a common limitation is that the tests are restricted to a very small group of methods and test problems, and in many cases to only one problem, due to the expense associated with testing. Moreover, when using multiple test problems, it is often difficult to make comparisons between the test problems when they belong to different classes of problems (e.g., linear, quadratic, nonlinear, etc.).

It is our belief that various factors contribute to the success of a given metamodeling technique, ranging from the nonlinearity of the model behavior, to the dimensionality and the data sampling technique, to the internal parameter settings of the various techniques. We contend that instead of using accuracy as the only measure, multiple metrics for comparison should be considered based on multiple modeling criteria, such as accuracy, efficiency, robustness, model transparency, and simplicity. Overall, knowledge of the performance of different metamodeling techniques with respect to different modeling criteria is of utmost importance to designers when trying to choose an appropriate technique for a particular application.

In this work we present preliminary results from a systematic comparative study which provides insightful observations into the performance of various metamodeling techniques under different modeling criteria, and into the impact of the contributing factors to their success. A set of mathematical and engineering problems has been selected to represent different classes of problems with different degrees of nonlinearity, different dimensions, and noisy versus smooth behaviors. Relatively large, small, and scarce sample sets are also used for each test problem. Four promising metamodeling techniques, namely Polynomial Regression (PR), Kriging (KG), Multivariate Adaptive Regression Splines (MARS), and Radial Basis Functions (RBF), are compared in this study. Although ANN is a well-known technique, it is not included in our study due to the large amount of trial-and-error associated with its use.

2 Metamodeling Techniques

The principal features of the four metamodeling techniques compared in our study are described in the following sections.

2.1 Polynomial Regression (PR)

PR models have been applied by a number of researchers (Engelund, et al., 1993; Unal, et al., 1996; Vitali, et al., 1997; Venkataraman, et al., 1997; Venter, et al., 1997; Chen, et al., 1996; Simpson, et al., 1997) in designing complex engineering systems. A second-order polynomial model can be expressed as:

\hat{y} = \beta_0 + \sum_{i=1}^{k} \beta_i x_i + \sum_{i=1}^{k} \beta_{ii} x_i^2 + \sum_{i<j} \beta_{ij} x_i x_j    (1)

When creating PR models, it is possible to identify the significance of different design factors directly from the coefficients of the normalized regression model. For problems with a large number of variables, it is important to use linear or second-order polynomial models to narrow the design variables down to the most critical ones. In optimization, the smoothing capability of polynomial regression allows quick convergence for noisy functions (see, e.g., Giunta, et al., 1994). In spite of these advantages, there is always a drawback when applying PR to model highly nonlinear behaviors. Higher-order polynomials can be used; however, instabilities may arise (Barton, 1992), or it may be too difficult to take sufficient sample data to estimate all of the coefficients in the polynomial equation, particularly in large dimensions. In this work, linear and second-order PR models are considered.
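As a concrete illustration of Eqn. (1), the sketch below assembles the second-order design matrix and estimates the beta coefficients by ordinary least squares with NumPy; the two-variable test function and the sample data are illustrative assumptions and are not one of the test problems used in this study.

```python
import numpy as np
from itertools import combinations

def quadratic_design_matrix(X):
    """Build the [1, x_i, x_i^2, x_i*x_j] columns of a second-order PR model (Eqn. 1)."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]                                 # linear terms
    cols += [X[:, i] ** 2 for i in range(k)]                            # pure quadratic terms
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]   # two-way interactions
    return np.column_stack(cols)

# Illustrative two-variable function (an assumption, not one of the 14 test problems)
rng = np.random.default_rng(0)
X_train = rng.uniform(-5.0, 5.0, size=(30, 2))
y_train = (X_train[:, 0] - 1.0) ** 2 + 2.0 * (X_train[:, 1] + 0.5) ** 2 + X_train[:, 0] * X_train[:, 1]

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X_train), y_train, rcond=None)

X_new = rng.uniform(-5.0, 5.0, size=(5, 2))
y_hat = quadratic_design_matrix(X_new) @ beta    # predictions from the fitted PR metamodel
```

Since the design matrix has (n+1)(n+2)/2 columns for n variables, the sample-size requirements discussed in Section 3.3 grow quickly with dimension.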
2.2 Kriging Method (KG)

A kriging model postulates a combination of a polynomial model and departures of the form:

\hat{y} = \sum_{j=1}^{k} \beta_j f_j(\mathbf{x}) + Z(\mathbf{x}),    (2)

where Z(x) is assumed to be a realization of a stochastic process with mean zero and spatial correlation function given by

\mathrm{Cov}\left[ Z(\mathbf{x}^i), Z(\mathbf{x}^j) \right] = \sigma^2 R(\mathbf{x}^i, \mathbf{x}^j),    (3)

where sigma^2 is the process variance and R is the correlation. A variety of correlation functions can be chosen (cf. Simpson, et al., 1998); however, the Gaussian correlation function proposed in (Sacks, et al., 1989) is the most frequently used. Furthermore, f_j(x) in Eqn. (2) is typically taken as a constant term. In our study, we use a constant term for f_j(x) and a Gaussian correlation function with p = 2 and k theta parameters, one theta for each of the k dimensions of the design space.

In addition to being extremely flexible due to the wide range of possible correlation functions, the kriging method has the advantages that it provides a basis for a stepwise algorithm to determine the important factors, and that the same data can be used for screening and for building the predictor model (Welch, et al., 1992). The major disadvantage of the kriging process is that model construction can be very time-consuming. Determining the maximum likelihood estimates of the theta parameters used to fit the model is a k-dimensional optimization problem, which can require significant computational time if the sample data set is large. Moreover, the correlation matrix can become singular if multiple sample points are spaced close to one another or if the sample points are generated from particular designs. Fitting problems have been observed with some full factorial designs and central composite designs when using kriging models (Meckesheimer, et al., 2000; Wilson, et al., 2000). Finally, the complexity of the method and the lack of commercial software may hinder this technique from becoming popular in the near term (Simpson, et al.).
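As a minimal illustration of Eqns. (2) and (3), the sketch below implements the prediction step of ordinary kriging with a constant term for f(x) and the Gaussian correlation function with p = 2. The theta values are simply fixed by hand here, whereas the study estimates them by maximum likelihood, and the one-dimensional training function is an illustrative assumption.

```python
import numpy as np

def gauss_corr(A, B, theta):
    """Gaussian correlation R(x, x') = exp(-sum_k theta_k (x_k - x'_k)^2), p = 2."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2 * theta).sum(axis=2)
    return np.exp(-d2)

def kriging_fit_predict(X, y, X_new, theta, nugget=1e-10):
    """Ordinary kriging predictor with a constant trend term (Eqns. 2-3)."""
    R = gauss_corr(X, X, theta) + nugget * np.eye(len(X))   # small nugget for numerical stability
    Rinv = np.linalg.inv(R)
    ones = np.ones(len(X))
    beta = (ones @ Rinv @ y) / (ones @ Rinv @ ones)          # generalized least-squares constant
    r = gauss_corr(X_new, X, theta)                          # correlations to the new points
    return beta + r @ Rinv @ (y - beta * ones)               # interpolating prediction

# Illustrative one-dimensional example (theta fixed by hand, not estimated by MLE)
rng = np.random.default_rng(1)
X_train = np.sort(rng.uniform(0.0, 10.0, size=(12, 1)), axis=0)
y_train = np.sin(X_train[:, 0]) + 0.1 * X_train[:, 0]
X_new = np.linspace(0.0, 10.0, 5).reshape(-1, 1)
y_hat = kriging_fit_predict(X_train, y_train, X_new, theta=np.array([1.0]))
```

In the study itself the theta parameters are found by maximum likelihood estimation, which is the k-dimensional optimization that makes kriging construction expensive for large sample sets (see Section 4.2).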

2.3 Multivariate Adaptive Regression Splines (MARS)

Multivariate Adaptive Regression Splines (Friedman, 1991) adaptively selects a set of basis functions for approximating the response function through a forward/backward iterative approach. A MARS model can be written as:

\hat{y} = \sum_{m=1}^{M} a_m B_m(\mathbf{x}),    (4)

where a_m is the coefficient of the expansion and the basis functions B_m can be represented as:

B_m(\mathbf{x}) = \prod_{k=1}^{K_m} \left[ s_{k,m} \left( x_{v(k,m)} - t_{k,m} \right) \right]_+^{q},    (5)

where K_m is the number of factors (the interaction order) in the m-th basis function, s_{k,m} = +/-1, x_{v(k,m)} is the v-th variable with 1 <= v(k,m) <= n, and t_{k,m} is a knot location on the corresponding variable. The subscript "+" means the function is a truncated power function:

\left[ s_{k,m} (x_{v(k,m)} - t_{k,m}) \right]_+^{q} =
\begin{cases}
\left[ s_{k,m} (x_{v(k,m)} - t_{k,m}) \right]^{q}, & \text{if } s_{k,m} (x_{v(k,m)} - t_{k,m}) > 0 \\
0, & \text{otherwise.}
\end{cases}    (6)

Compared to other techniques, the use of MARS for engineering design applications is relatively new. Buja, et al. (1990) use MARS for extensive analysis of data concerning memory usage in electronic switches. Wang, et al. (1999) compare MARS to linear, second-order, and higher-order regression models for a five-variable automobile structural analysis. Friedman (1991) uses the MARS procedure to approximate the behavior of performance variables in a simple alternating current series circuit. The major advantages of using the MARS procedure appear to be its accuracy and the major reduction in the computational cost associated with constructing the metamodel.
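As a rough illustration of the truncated power basis in Eqns. (5) and (6), the sketch below builds first-order (q = 1) hinge functions at hand-picked knots and fits their coefficients a_m by least squares. Real MARS selects knots, variables, and interaction terms adaptively through its forward/backward passes; that adaptive search is omitted here, and the example data are an assumption for illustration only.

```python
import numpy as np

def hinge(x, t, s):
    """Truncated power basis [s * (x - t)]_+ with q = 1 (Eqn. 6)."""
    return np.maximum(s * (x - t), 0.0)

def mars_like_design(X, knots):
    """Columns: intercept plus a pair of opposite-signed hinges per (variable, knot)."""
    cols = [np.ones(len(X))]
    for var, t in knots:
        cols.append(hinge(X[:, var], t, +1.0))
        cols.append(hinge(X[:, var], t, -1.0))
    return np.column_stack(cols)

# Illustrative data and hand-picked knots (MARS would choose these automatically)
rng = np.random.default_rng(2)
X_train = rng.uniform(-3.0, 3.0, size=(40, 2))
y_train = np.abs(X_train[:, 0]) + 0.5 * np.maximum(X_train[:, 1] - 1.0, 0.0)

knots = [(0, 0.0), (1, 1.0)]                        # (variable index, knot location)
B = mars_like_design(X_train, knots)
a, *_ = np.linalg.lstsq(B, y_train, rcond=None)     # coefficients a_m of Eqn. (4)
y_hat = mars_like_design(X_train, knots) @ a
```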
2.4 Radial Basis Functions (RBF)

Radial basis functions (Hardy, 1971; Dyn, et al., 1986) were developed for the interpolation of scattered multivariate data. The method uses linear combinations of a radially symmetric function, based on the Euclidean distance or another such metric, to approximate response functions. A radial basis function model can be expressed as:

\hat{y} = \sum_{i} a_i \left\lVert \mathbf{x} - \mathbf{x}_i^{0} \right\rVert,    (7)

where a_i is the coefficient of the expansion and x_i^0 is the i-th observed input. Radial basis function approximations have been shown to produce good fits to arbitrary contours of both deterministic and stochastic response functions (Powell, 1987). Tu and Barton (1997) found that RBF approximations provide effective metamodels for electronic circuit simulation models. Meckesheimer, et al. (2000) use the method to construct metamodels for a desk lamp design example, which has both continuous and discrete response functions.
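A minimal sketch of Eqn. (7): the coefficients a_i are obtained by solving the linear system built from the pairwise distances between the observed inputs, so the resulting model interpolates the training data. The plain distance is used as the radial function here, and the training data are an illustrative assumption.

```python
import numpy as np

def dist_matrix(A, B):
    """Pairwise Euclidean distances between the rows of A and the rows of B."""
    return np.sqrt(((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2))

def rbf_fit(X, y):
    """Solve Phi a = y so that the metamodel interpolates the training data (Eqn. 7)."""
    return np.linalg.solve(dist_matrix(X, X), y)

def rbf_predict(X, a, X_new):
    return dist_matrix(X_new, X) @ a

# Illustrative two-variable example
rng = np.random.default_rng(3)
X_train = rng.uniform(-2.0, 2.0, size=(25, 2))
y_train = np.sin(X_train[:, 0]) * np.cos(X_train[:, 1])
a = rbf_fit(X_train, y_train)
y_hat = rbf_predict(X_train, a, rng.uniform(-2.0, 2.0, size=(5, 2)))
```

Other radial functions, such as Hardy's multiquadric, simply change the entries of the interpolation matrix Phi.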

3 Test Problems and Test Scheme

3.1 Features of Test Problems

To test the effectiveness of the various approaches on different classes of problems, 14 test problems are selected and classified based on the following representative features of engineering design problems.

Problem scale. Two relative scales are considered: large (the number of variables >= 10) and small (the number of variables = 2, 3).

Nonlinearity of the performance behavior. For convenience, we classify the problems into two categories: low-order nonlinearity (if the R square of the regression is >= 0.99 when using a first- or second-order polynomial model) and high-order nonlinearity (otherwise).

Noisy versus smooth behavior. In some cases, numerical simulation error or other sources of noise cannot be eliminated. In our study, noisy behavior is artificially created using local variations of a smooth function.

A summary of the features of the 14 test problems is given in Table 1; the test problems are described in more detail in the next section.

Table 1. Features of Test Problems

Problem No.   Type               Nonlinearity Order   Scale (# of inputs)   Noisy Behavior
1             Mathematical       High                 n = 10                No
2             Mathematical       Low                  n = 10                No
3             Mathematical       High                 n = 10                No
4             Mathematical       Low                  n = 10                No
5             Mathematical       Low                  n = 16                No
6             Mathematical       High                 n = 2                 No
7             Mathematical       High                 n = 2                 No
8             Mathematical       Low                  n = 2                 No
9             Mathematical       High                 n = 3                 No
10            Mathematical       High                 n = 3                 No
11            Mathematical       Low                  n = 3                 No
12            Mathematical       Low                  n = 2                 No
13            Mathematical       Low                  n = 2                 Yes
14            Vehicle handling   High                 n = 14                No

3.2 Description of Test Problems

Thirteen mathematical problems are utilized in our study (see the Appendix). These test problems are chosen from (Hock and Schittkowski, 1981), which offers a collection of problems for testing nonlinear optimization algorithms. While some of the functions exhibit smooth low-order nonlinear behavior, the others are highly nonlinear functions that pose challenges for many metamodeling techniques. Figure A.1 shows grid plots of the highly nonlinear problems except Problem 7, for which the plots obtained from the different approaches are compared in Figure 12.

Meanwhile, Problem 14 is a real engineering problem that calls for a better vehicle design to improve a vehicle's handling characteristics, particularly the prevention of rollover. The problem statement is first given in (Chen, et al., 1999). The simulator used is the integrated computer tool ArcSim (ArcSim, 1997; Sayers and Riley, 1996), developed at the University of Michigan for simulating and analyzing the dynamic behavior of 6-axle tractor-semitrailers. Each simulation takes more than three minutes to run on a Sun UltraSparc workstation, so the use of ArcSim for the purpose of optimization demands heavy computational costs. In this study, 14 input variables are considered, which include nine suspension and vehicle parameters as design variables and five uncontrollable factors for steering and braking. The response of interest is the vehicle handling performance, which is measured by the rollover metric. Previous studies (Chen, et al., 1999) indicate that the rollover metric has a highly nonlinear dependence on the control and noise variables, especially the brake and steering levels.

3.3 Data Sampling

We are interested in examining the performance of the various metamodeling techniques when different sizes of data samples are used for model fitting, as shown in Table 2. For the different problem scales, different sets of sample data (scarce, small, and large sets) are used as training points. Latin Hypercube sampling (McKay, et al., 1979) is used to generate the training points in all cases because this method provides good uniformity and flexibility in the size of the sample.

A second-order polynomial model has k = (n+1)(n+2)/2 coefficients for n design variables. Giunta, et al. (1994) and Kaufman, et al. (1996) found that from 1.5k sample points for 5-10 variable problems up to 4.5k sample points for 20-30 variable problems are necessary to obtain reasonably accurate second-order polynomial models. Therefore, for large-scale problems, 3k sample points are selected and are referred to as a large sample set. For complex and time-consuming problems it is preferable to use fewer samples; for this reason, scarce sample sets with 3n points are also tested. In addition to the large and scarce sample sets, small sample sets with 10n points are used. For small-scale problems, only small and large sample sets are considered. Also shown in Table 2 is the number of confirmation points used for checking the accuracy of each model; the Monte Carlo method is used to generate the confirmation points.

Table 2. Experimental Designs for Test Problems

Large scale problems: scarce set: Latin Hypercube (3n points); small set: Latin Hypercube (10n points); large set: Latin Hypercube (3(n+1)(n+2)/2 points).
Small scale problems: scarce set: not used; small set: Latin Hypercube (9 points if n=2, 27 if n=3); large set: Latin Hypercube (100 points if n=2, 125 if n=3).
Confirmation points: Monte Carlo method (500 points for the vehicle problem, ... for the other problems).
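As an aside on how the Latin Hypercube training points in Table 2 can be generated: the range of each variable is divided into N equal strata, one point is drawn within each stratum, and the strata are paired across variables by independent random permutations (McKay, et al., 1979). The sketch below is a minimal implementation; the bounds and sample size are illustrative assumptions.

```python
import numpy as np

def latin_hypercube(n_samples, bounds, seed=None):
    """Latin Hypercube sample: one point per stratum in every dimension (McKay et al., 1979)."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)        # shape (n_vars, 2): [lower, upper] per variable
    n_vars = len(bounds)
    # A random permutation of the strata in each column, jittered within each stratum
    perm = np.argsort(rng.random((n_samples, n_vars)), axis=0)
    u = (perm + rng.random((n_samples, n_vars))) / n_samples
    return bounds[:, 0] + u * (bounds[:, 1] - bounds[:, 0])

# e.g. a 10n = 30 point small set for a 3-variable problem on [-5, 5]^3
X_train = latin_hypercube(30, [(-5, 5)] * 3, seed=0)
```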
3.4 Metrics for Performance Measures

In accordance with having multiple metamodeling criteria, the performance of each metamodeling technique is measured in the following respects.

Accuracy: the capability of predicting the system response over the design space of interest.

Robustness: the capability of achieving good accuracy for different problem types and sample sizes.

Efficiency: the computational effort required for constructing the metamodel and for predicting the response at a set of new points.

Transparency: the capability of illustrating explicit relationships between the input variables and the responses.

Conceptual simplicity: ease of implementation. Simple methods should require minimum user input and be easily adapted to each problem.

For accuracy, the goodness of fit obtained from the training data is not sufficient to assess the accuracy at newly predicted points. For this reason, additional confirmation samples (see Table 2) are used to verify the accuracy of the metamodels. To provide a more complete picture of metamodel accuracy, three different metrics are used: R Square, Relative Average Absolute Error, and Relative Maximum Absolute Error. The equations for these three measures are given in Eqns. (8) to (10), respectively.

(a) R Square:

R^2 = 1 - \frac{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2}{\sum_{i=1}^{n} (y_i - \bar{y})^2} = 1 - \frac{\mathrm{MSE}}{\mathrm{Variance}},    (8)

where \hat{y}_i is the predicted value corresponding to the observed value y_i, and \bar{y} is the mean of the observed values. While the MSE (Mean Square Error) represents the departure of the metamodel from the real simulation model, the variance captures how irregular the problem is. The larger the value of R Square, the more accurate the metamodel.

(b) Relative Average Absolute Error (RAAE):

\mathrm{RAAE} = \frac{\sum_{i=1}^{n} \lvert y_i - \hat{y}_i \rvert}{n \cdot \mathrm{STD}},    (9)

where STD stands for the standard deviation. The smaller the value of RAAE, the more accurate the metamodel.

(c) Relative Maximum Absolute Error (RMAE):

\mathrm{RMAE} = \frac{\max\left( \lvert y_1 - \hat{y}_1 \rvert, \lvert y_2 - \hat{y}_2 \rvert, \ldots, \lvert y_n - \hat{y}_n \rvert \right)}{\mathrm{STD}}.    (10)

While RAAE is usually highly correlated with MSE and thus with R Square, RMAE is not necessarily. A large RMAE indicates a large error in one region of the design space even though the overall accuracy indicated by R Square and RAAE can be very good. Therefore, a small RMAE is preferred; however, since this metric cannot show the overall performance over the design space, it is not as important as R Square and RAAE.
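The three accuracy measures in Eqns. (8)-(10) are computed on the confirmation points; the sketch below evaluates them directly with NumPy (the small arrays are placeholder values, not results from the study).

```python
import numpy as np

def accuracy_metrics(y, y_hat):
    """R Square, RAAE and RMAE of a metamodel on confirmation data (Eqns. 8-10)."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    abs_err = np.abs(y - y_hat)
    total_ss = np.sum((y - y.mean()) ** 2)           # n times the variance of the observations
    std = y.std()                                    # STD of the observed responses
    r_square = 1.0 - np.sum((y - y_hat) ** 2) / total_ss
    raae = abs_err.sum() / (len(y) * std)
    rmae = abs_err.max() / std
    return r_square, raae, rmae

# y_confirm: simulation responses at the confirmation points; y_pred: metamodel predictions
y_confirm = np.array([1.2, 0.7, 2.4, 3.1, 1.9])
y_pred    = np.array([1.0, 0.9, 2.2, 3.3, 2.0])
r2, raae, rmae = accuracy_metrics(y_confirm, y_pred)
```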

4 Results and Comparison

Based on the proposed scheme for the comparative study, 136 metamodels are created for the 14 test problems (see Table 1), using the different sets of sample data (see Table 2) and the four different metamodeling techniques (see Sections 2.1-2.4). The different techniques are compared based on the results at the confirmation points.

4.1 Accuracy and Robustness

To illustrate the performance of the metamodeling techniques under different circumstances (e.g., nonlinearity, problem size, and sample size), multiple bar charts are provided. While the mean indicates the average accuracy of a technique, the variance illustrates the robustness of that accuracy. Finally, while the height of a bar indicates the magnitude of accuracy, the differences between the heights of multiple bars illustrate the impact of a particular contributing factor.

4.1.1 Overall Performance

Illustrated in Figures 1 and 2 are the mean and variance of the three accuracy metrics over all metamodels, covering the different orders of nonlinearity, problem sizes, and sample sizes. For the mean values, the larger the R Square, the better the accuracy; for both RAAE and RMAE, a smaller value indicates better accuracy. For the variance, a smaller value always indicates higher robustness.

Figure 1. The Mean of the Accuracy Metrics (R Square, RAAE, and RMAE for each technique)

Figure 2. The Variance of the Accuracy Metrics (R Square, RAAE, and RMAE for each technique)

Figure 1 shows that the average accuracies of RBF and KG over all test cases are among the best in the group, and their values are very close to each other. RBF is slightly better than KG in R Square (both close to 0.8), but KG is better than RBF in both RAAE and RMAE. The average accuracy of PR is third in the group. As revealed in Section 4.1.3, the poor average performance of MARS is due to its deficiency when only a scarce set of samples is available. In terms of the robustness of the accuracy over all test cases, RBF is distinctly the best for all three accuracy measures. Overall, RBF is shown to be the best approach in terms of its average accuracy and robustness when handling all types of problems with any amount of samples.

4.1.2 Performance for Different Types of Problems

Figures 3-6 show the mean and variance of the R Square of the metamodels for different types of problems. In Figures 3 and 4, "High" and "Low" represent the nonlinearity of the problems, while "Large" and "Small" represent the problem scale; so, for example, "High Large" means a high-order nonlinear, large scale problem. The values in Figures 3 and 4 are derived from the data for all sample sizes (large, small, and scarce). It is noted that for high-order nonlinear, large scale problems, RBF performs best in terms of both average accuracy and robustness. For low-order nonlinear, large scale problems, KG performs best in terms of both average accuracy and robustness. For high-order nonlinear, small scale problems, RBF performs best in terms of both average accuracy and robustness. For low-order nonlinear, small scale problems, PR performs best in terms of both average accuracy and robustness.

Figure 3. The Mean of R Square for Different Types of Problems (a)

Figure 4. The Variance of R Square for Different Types of Problems (a)

Figure 5. The Mean of R Square for Different Types of Problems (b)

Figure 6. The Variance of R Square for Different Types of Problems (b)

In Figures 5 and 6, the average accuracy and its robustness are derived for single contributing factors (e.g., high-order nonlinearity) based on all the test data belonging to that category. They indicate that for high-order nonlinear problems, RBF performs best in terms of both average accuracy and robustness. For low-order nonlinear problems, the average accuracy of KG and PR is very close, while the robustness of KG is slightly better than that of PR; so, overall, KG is slightly better than PR. We also observe that each method has distinctly better accuracy for low-order nonlinear problems than for high-order nonlinear problems, which matches well with our intuition. The difference is most significant for PR: while its mean R Square is close to 1 for low-order nonlinear problems, it is less than 0.35 for high-order nonlinear problems. Except for MARS (due to its deficiency for scarce sample sets), the accuracy of the other three methods is acceptable for low-order nonlinear problems with any size of sample set. However, the variance of the accuracy, although small for RBF, KG, and PR when the model behavior is low-order nonlinear, becomes larger when the problems are high-order nonlinear; the impact is most significant for KG and PR.

For large scale problems, the average accuracy of RBF and KG is very close, while the robustness of RBF is better than that of KG; so, overall, RBF is the best. For small scale problems, RBF is again the best in terms of both average accuracy and robustness. It is also found that the problem scale has little impact on the performance of RBF. Although the impact of problem scale on the average accuracy of KG is also small, its impact on the robustness of KG is rather large.

4.1.3 Performance under Different Sample Sizes

Figures 7 and 8 show the performance of the metamodeling techniques for all types of problems under the different sample sizes (large, small, and scarce). For large sample sets, the performance of MARS, RBF, and KG is very close, not only in average accuracy but also in robustness, while the performance of PR is worse than the others. We cannot tell which is best overall, because KG performs slightly better than RBF and MARS in average accuracy while MARS is more robust than KG and RBF. RBF performs best for small sample sets in both average accuracy and robustness; although KG also performs well in average accuracy, it is not as robust. For scarce sample sets, the average accuracies of RBF, KG, and PR are close, but the robustness of RBF is the best; therefore, for scarce sample sets, RBF performs best overall. It is also noted that the sample size has the largest impact on MARS, for both the mean and the variance of the accuracy. When small or scarce sample sets are used, the accuracy of MARS is low; this is because MARS fails to work when the sample size is too small. The variances of the accuracy (robustness) are shown to be very small for MARS, RBF, and KG when large sample sets are used.

Figure 7. The Mean of R Square under Different Sample Scales

Figure 8. The Variance of R Square under Different Sample Scales

Figures 9 and 10 further illustrate the performance of the metamodeling techniques for different sample sizes when handling the most difficult situation, i.e., large scale and high-order nonlinear problems. They show that the average accuracy of MARS is the best when large sample sets are used for this type of problem. For small sample sets, MARS also performs best if average accuracy and robustness are both considered (although RBF performs best in average accuracy alone). However, its performance deteriorates significantly when the sample size becomes scarce, in which case RBF performs best. The impact of sample size on average accuracy and robustness is smallest for RBF. The accuracy and robustness of PR are not stable: for high-order nonlinear problems, its performance is very problem-dependent and sample-dependent (not only on the sample size).

Figure 9. The Mean of R Square under Different Sample Scales for Large Scale and High-Order Nonlinear Problems

Figure 10. The Variance of R Square under Different Sample Scales for Large Scale and High-Order Nonlinear Problems

4.1.4 Impact of Noisy Behavior

Figure 11 shows the influence of noise on the performance of the different metamodeling techniques. Only Problems 12 and 13 are compared here, since the function in Problem 13 is the result of local variations of the function in Problem 12, a low-order nonlinear problem.

From Figure 11, it is found that kriging is very sensitive to the noise since it interpolates the data. Consequently, when the accuracy of the kriging metamodel for Problem 13 is estimated using the non-noisy data from Problem 12, the kriging model does not yield good predictions. PR performs the best because of its tendency to give a smooth metamodel. MARS and RBF also perform well on this test problem.

Figure 11. R Square for the Smooth (Problem 12) versus Noisy (Problem 13) Problems

Figure 12. Grid Plots for Problem 7: (a) from the analytical model, (b) from MARS, (c) from RBF, (d) from KG, (f) from PR

Due to space limitations, we only provide here the sample grid plots of Problem 7, which has a low-dimensional but highly nonlinear (waving) behavior, to compare the accuracy of the different approaches. It is noted that KG is extremely accurate for modeling the waving behavior in this particular case, while RBF is the second best. We also found that PR is not suitable at all for this type of behavior, while MARS captures the general trend but falls short in its accuracy in local regions.

4.2 Efficiency

The efficiency of each metamodeling technique is measured by the time used for model construction and for new predictions. The time depends on the problem scale and the sample size, and it also depends highly on the computer platform (MARS, PR, and RBF are tested on a 500 MHz Pentium III PC, while KG is run on a Sun Ultra60 workstation). Rough statistics of the time needed for model construction and for new predictions are provided in Tables 3 and 4, respectively.

Table 3. Time Needed to Construct the Metamodel (by problem scale / sample size, ordered from large scale problems with large sample sets to small scale problems with small sample sets)

MARS   5-10 s    1-5 s       < 1 s     << 1 s
RBF    5-10 s    1-2 s       < 1 s     << 1 s
KG     1-3 h     10-25 min   1-5 min   10-30 s
PR     1-2 s     < 1 s       << 1 s    << 1 s

Table 4. Time Needed for 1000 New Predictions (same problem scale / sample size cases as Table 3)

MARS   << 1 s    << 1 s   << 1 s   << 1 s
RBF    10-20 s   1-5 s    1-5 s    < 1 s
KG     5-10 s    1-3 s    1-3 s    < 1 s
PR     << 1 s    << 1 s   << 1 s   << 1 s

It is obvious that PR is the most efficient method in both model construction and response prediction. It is found that, relatively speaking, model construction is very time-consuming for kriging: kriging requires a k-dimensional optimization to find the maximum likelihood estimates of the theta parameters used to fit the model, which can become computationally expensive when the problem scale and the sample size are large. For predictions, RBF and kriging are relatively slow because spatial distances must be evaluated for each prediction. However, all of the reported times are considered small compared to the time needed by the simulations used to obtain the training data in complex applications.

4.3 Transparency

PR provides the best transparency in terms of the functional relationship and the factor contributions (see Eqn. (1)).

When using MARS, models can be recast into the form (Friedman, 1991):

\hat{y} = a_0 + \sum_{K_m = 1} f_i(x_i) + \sum_{K_m = 2} f_{ij}(x_i, x_j) + \sum_{K_m = 3} f_{ijk}(x_i, x_j, x_k) + \cdots    (11)

The first sum is over all basis functions that involve only a single variable. The second sum is over all basis functions that involve exactly two variables, representing (if present) two-variable interactions. Similarly, the third sum represents (if present) the contributions from three-variable interactions, and so on. Therefore, MARS also provides some model transparency. For RBF, an explicit function exists (Eqn. (7)); however, the factor contributions are not clear. The same holds true for kriging; the theta parameters can be interpreted with some practice (large theta values indicate a highly nonlinear function, while small thetas indicate a smooth function with little variation), but the influence of each input variable on the final output response cannot be readily ascertained from a kriging model.

4.4 Simplicity

Applying PR and RBF is relatively straightforward, and no parameters need to be specified by the user; they are both easy to implement. MARS and kriging are more sophisticated in theory. For MARS, internal parameters can be specified which may improve or deteriorate its performance depending on the problem. For kriging models, the user can manipulate the optimization parameters used when constructing the model, and there are a variety of choices for the correlation function and for the underlying global portion of the model. While the Gaussian correlation function and a constant underlying global model are the most frequently used, it is currently unclear to what extent more complex kriging models improve the accuracy of the model.

5 Summary

The systematic comparative study presented in this paper has provided insightful observations into the performance of various metamodeling techniques under different modeling criteria. Based on the discussions in Section 4, we provide here a summary of our observations and conclusions.

In terms of the accuracy and robustness of the various techniques for different types of problems (i.e., different orders of nonlinearity and problem scales), we can summarize our results as shown in Table 5. As shown, RBF excels in most of the categories. When large sample sets are used (assuming designers have enough computational resources), MARS, RBF, and KG perform equally well in both average accuracy and robustness. For small and scarce sample sets, RBF performs best when both average accuracy and robustness are considered. It is noted that the performance of MARS deteriorates when small or scarce sample sets are used. For the most difficult problems, i.e., large scale and high-order nonlinear problems, the average accuracy of MARS is the best when large sample sets are used; MARS also performs best if average accuracy and robustness are both considered. However, its performance deteriorates significantly when the sample size is reduced to scarce, in which case RBF performs best. For RBF, the impact of sample size on average accuracy and robustness is the least. Finally, on our test problem with noise, PR performs the best and MARS also works well; however, KG is very sensitive to the noise because it interpolates the sample data. From the above observations, we can conclude that RBF is the most dependable method in most situations in terms of accuracy and robustness.

Table 5. Summary of Best Methods

              High-Order Nonlinear   Low-Order Nonlinear   Overall
Large Scale   RBF                    Kriging               RBF
Small Scale   RBF                    PR                    RBF
Overall       RBF                    PR                    RBF

In terms of efficiency of metamodel construction, KG can be very time-consuming, especially for large scale problems with large sample sizes.
Meanwhile, PR takes the least amount of time for model building. For all techniques, the time needed for new predictions is considered trivial compared to the time used for simulation and model construction. PR and MARS have good transparency, which means we can obtain the contributions of each input factor and the interactions among them; both RBF and KG are less transparent in this sense. In terms of simplicity, PR and RBF are the easiest to implement, while the user needs to configure the parameters of MARS and KG to achieve better accuracy. Although PR is not accurate for highly nonlinear problems, it is easy to use and is very accurate for low-order nonlinearity. It is therefore proposed that, when constructing metamodels, PR should be implemented first to see whether a reasonable fit can be obtained.

It is also observed that experimental design (data sampling) plays an important role, especially when building KG models. If the samples are not properly selected, KG may not work well and may sometimes even fail to produce a metamodel. Data sampling also has some influence on RBF; it is found that for small scale problems the accuracy improves when a full factorial design is used. Although not tested, data sampling is expected to impact the performance of the other techniques as well.

It is desirable to obtain samples that are uniform not only in one-dimensional projections but also in higher dimensional projections, to improve metamodeling performance. Adaptive sampling methods, which sample each variable according to its contributions to the response and to the interactions with other variables, should also be investigated. Finally, to improve our study, more test problems with large dimensions and some medium dimensions (between 3 and 10 variables) need to be considered. For the same size of data sample, the effectiveness of different sampling techniques is also of interest.

Acknowledgements

The support from the NSF/DMII is gratefully acknowledged. Support from Dr. Kam Ng, ONR 333, through the Naval Sea Systems Command under Contract No. N G-0058 is also acknowledged.

References

ArcSim User's Guide, Automotive Research Center, The University of Michigan, Ann Arbor, July 1997.

Barthelemy, J.-F. M. and Haftka, R. T., 1993, "Approximation Concepts for Optimum Structural Design - A Review," Structural Optimization, Vol. 5.

Barton, R. R., 1992, December 13-16, "Metamodels for Simulation Input-Output Relations," Proceedings of the 1992 Winter Simulation Conference (Swain, J. J., et al., eds.), Arlington, VA, IEEE.

Booker, A. J., Dennis, J. E., Jr., Frank, P. D., Serafini, D. B., Torczon, V. and Trosset, M. W., 1999, "A Rigorous Framework for Optimization of Expensive Functions by Surrogates," Structural Optimization, 17(1), pp. 1-13.

Box, G. E. P., Hunter, W. G. and Hunter, J. S., 1978, Statistics for Experimenters, John Wiley & Sons, New York.

Chen, W., Allen, J. K., Mavris, D. and Mistree, F., 1996, "A Concept Exploration Method for Determining Robust Top-Level Specifications," Engineering Optimization, Vol. 26.

Chen, W., Garimella, R., and Michelena, N., 1999, "Robust Design for Improved Vehicle Handling Under a Range of Maneuver Conditions," 1999 ASME Design Technical Conferences, Las Vegas, NV, Paper No. DAC.

Cheng, B. and Titterington, D. M., 1994, "Neural Networks: A Review from a Statistical Perspective," Statistical Science, Vol. 9, No. 1.

Cressie, N. A. C., 1993, Statistics for Spatial Data, Revised Edition, John Wiley & Sons, New York.

Dyn, N., Levin, D. and Rippa, S., 1986, "Numerical Procedures for Surface Fitting of Scattered Data by Radial Basis Functions," SIAM Journal on Scientific and Statistical Computing, Vol. 7, No. 2.

Engelund, W. C., Douglas, O. S., Lepsch, R. A., McMillan, M. M. and Unal, R., 1993, "Aerodynamic Configuration Design Using Response Surface Methodology Analysis," AIAA Aircraft Design, Systems and Operations Meeting, Monterey, CA, AIAA Paper.

Friedman, J. H., 1991, "Multivariate Adaptive Regression Splines," The Annals of Statistics, 19(1), pp. 1-141.

Giunta, A. A., Dudley, J. M., Narducci, R., Grossman, B., Haftka, R. T., Mason, W. H., and Watson, L. T., 1994, "Noisy Aerodynamic Response and Smooth Approximations in HSCT Design," Proceedings of the 5th AIAA/USAF/NASA/ISSMO Symposium on Multidisciplinary Analysis and Optimization (Panama City, FL), AIAA, Washington, DC.

Giunta, A., Watson, L. T. and Koehler, J., 1998, September 2-4, "A Comparison of Approximation Modeling Techniques: Polynomial Versus Interpolating Models," 7th AIAA/USAF/NASA/ISSMO Symposium on Multidisciplinary Analysis & Optimization, St. Louis, MO, AIAA, Vol. 1.

Hardy, R. L., 1971, "Multiquadric Equations of Topography and Other Irregular Surfaces," Journal of Geophysical Research, Vol. 76.

Hock, W. and Schittkowski, K., 1981, Test Examples for Nonlinear Programming Codes, Springer-Verlag, New York.

Kaufman, M., Balabanov, V., Burgee, S. L., Giunta, A. A., Grossman, B., Mason, W. H., and Watson, L. T., 1996, "Variable-Complexity Response Surface Approximations for Wing Structural Weight in HSCT Design," 34th Aerospace Sciences Meeting and Exhibit, Reno, NV, AIAA Paper.
Kleijnen, J. P. C., 1987, Statistical Tools for Simulation Practitioners, Marcel Dekker, New York.

McKay, M. D., Beckman, R. J. and Conover, W. J., 1979, "A Comparison of Three Methods for Selecting Values of Input Variables in the Analysis of Output from a Computer Code," Technometrics, 21(2).

Meckesheimer, M., Barton, R. R., Limayem, F. and Yannou, B., 2000, "Metamodeling of Combined Discrete/Continuous Responses," Design Theory and Methodology - DTM 2000 (Allen, J. K., ed.), ASME, Baltimore, MD, Paper No. DETC2000/DTM.

Myers, R. H. and Montgomery, D. C., 1995, Response Surface Methodology: Process and Product Optimization Using Designed Experiments, John Wiley & Sons, New York.

Powell, M. J. D., 1987, "Radial Basis Functions for Multivariable Interpolation: A Review," Algorithms for Approximation (J. C. Mason and M. G. Cox, eds.), Oxford University Press, London.

Sacks, J., Welch, W. J., Mitchell, T. J. and Wynn, H. P., 1989, "Design and Analysis of Computer Experiments," Statistical Science, 4(4).

Sayers, M. W. and Riley, S. M., 1996, "Modeling Assumptions for Realistic Multibody Simulations of the Yaw and Roll Behavior of Heavy Trucks," SAE Paper No. 96073, Society of Automotive Engineers, Warrendale, PA.

Simpson, T. W., Mauery, T. M., Korte, J. J. and Mistree, F., 1998, September 2-4, "Comparison of Response Surface and Kriging Models for Multidisciplinary Design Optimization," 7th AIAA/USAF/NASA/ISSMO Symposium on Multidisciplinary Analysis & Optimization, St. Louis, MO, AIAA, Vol. 1.

Simpson, T. W., 1999, November, "A Comparison of Metamodeling Strategies for Computer-Based Engineering Design," INFORMS Philadelphia Fall 1999 Meeting, Philadelphia, PA, INFORMS.

Simpson, T. W., Peplinski, J., Koch, P. N. and Allen, J. K., 1997, "On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments," Design Theory and Methodology - DTM '97, Sacramento, CA, ASME, Paper No. DETC97/DTM.

Sobieszczanski-Sobieski, J. and Haftka, R. T., 1997, "Multidisciplinary Aerospace Design Optimization: Survey of Recent Developments," Structural Optimization, 14(1), pp. 1-23.

Smith, M., 1993, Neural Networks for Statistical Modeling, Van Nostrand Reinhold, New York.

Tu, C. H. and Barton, R. R., 1997, "Production Yield Estimation by the Metamodel Method with a Boundary-Focused Experiment Design," Design Theory and Methodology Conference - DTM '97, Paper No. DETC97/DTM-3870.

Unal, R., Lepsch, R. A., Engelund, W. and Stanley, D. O., 1996, "Approximation Model Building and Multidisciplinary Design Optimization Using Response Surface Methods," 6th AIAA/USAF/NASA/ISSMO Symposium on Multidisciplinary Analysis and Optimization, Bellevue, WA, AIAA, Vol. 1.

Varadarajan, S., Chen, W., and Pelka, C., 2000, "The Robust Concept Exploration Method with Enhanced Model Approximation Capabilities," Engineering Optimization, 32(3).

Venter, G., Haftka, R. T. and Starnes, J. H., Jr., 1996, "Construction of Response Surfaces for Design Optimization Applications," 6th AIAA/USAF/NASA/ISSMO Symposium on Multidisciplinary Analysis and Optimization, Bellevue, WA, AIAA, Vol. 1.

Wang, X., Liu, Y. and Antonsson, E. K., 1999, September 12-15, "Fitting Functions to Data in High Dimensional Design Spaces," Advances in Design Automation, Las Vegas, NV, ASME, Paper No. DETC99/DAC-86.

Welch, W. J., Buck, R. J., Sacks, J., Wynn, H. P., Mitchell, T. J. and Morris, M. D., 1992, "Screening, Predicting, and Computer Experiments," Technometrics, 34(1).

Yang, R. J., Gu, L., Law, L., Gearhart, C., Tho, C. H., Lu, X., and Wang, B. P., 2000, "Approximations for Safety Optimization of Systems," ASME Design Automation Conference, September 10-13, Baltimore, MD, Paper No. DETC2000/DAC.

Appendix A. Test Problems

Problems 1-13 are mathematical test functions selected from Hock and Schittkowski (1981), with the dimensions and nonlinearity orders summarized in Table 1. Problem 13 is obtained from Problem 12 by adding a noise term ε(x₁, x₂): a normally distributed noise with zero mean and a variance equal to 1/100 of the variance of the smooth part of f(x); for -5 ≤ x₁, x₂ ≤ 5, the variance of the noise is 254.9/100 = 2.55. Problem 14 is the vehicle handling problem described in Section 3.2.

Figure A.1. Grid Plots of Highly Nonlinear Models (P stands for Problem): (a) P1 (x₂, x₃; other xᵢ = 3), (b) P3 (x₂, x₃; other xᵢ = 3), (c) P6 (x₁, x₂), (d) P9 (x₂, x₃; x₁ = 0), (e) P10 (x₂, x₃; x₁ = 1.8), (f) P14, vehicle handling (rollover metric in degrees versus brake level in psi and steering level in degrees).

Chapter 13: Multiple Regression

Chapter 13: Multiple Regression Chapter 13: Multple Regresson 13.1 Developng the multple-regresson Model The general model can be descrbed as: It smplfes for two ndependent varables: The sample ft parameter b 0, b 1, and b are used to

More information

Chapter 11: Simple Linear Regression and Correlation

Chapter 11: Simple Linear Regression and Correlation Chapter 11: Smple Lnear Regresson and Correlaton 11-1 Emprcal Models 11-2 Smple Lnear Regresson 11-3 Propertes of the Least Squares Estmators 11-4 Hypothess Test n Smple Lnear Regresson 11-4.1 Use of t-tests

More information

Global Sensitivity. Tuesday 20 th February, 2018

Global Sensitivity. Tuesday 20 th February, 2018 Global Senstvty Tuesday 2 th February, 28 ) Local Senstvty Most senstvty analyses [] are based on local estmates of senstvty, typcally by expandng the response n a Taylor seres about some specfc values

More information

Simulated Power of the Discrete Cramér-von Mises Goodness-of-Fit Tests

Simulated Power of the Discrete Cramér-von Mises Goodness-of-Fit Tests Smulated of the Cramér-von Mses Goodness-of-Ft Tests Steele, M., Chaselng, J. and 3 Hurst, C. School of Mathematcal and Physcal Scences, James Cook Unversty, Australan School of Envronmental Studes, Grffth

More information

LINEAR REGRESSION ANALYSIS. MODULE IX Lecture Multicollinearity

LINEAR REGRESSION ANALYSIS. MODULE IX Lecture Multicollinearity LINEAR REGRESSION ANALYSIS MODULE IX Lecture - 30 Multcollnearty Dr. Shalabh Department of Mathematcs and Statstcs Indan Insttute of Technology Kanpur 2 Remedes for multcollnearty Varous technques have

More information

Psychology 282 Lecture #24 Outline Regression Diagnostics: Outliers

Psychology 282 Lecture #24 Outline Regression Diagnostics: Outliers Psychology 282 Lecture #24 Outlne Regresson Dagnostcs: Outlers In an earler lecture we studed the statstcal assumptons underlyng the regresson model, ncludng the followng ponts: Formal statement of assumptons.

More information

Originated from experimental optimization where measurements are very noisy Approximation can be actually more accurate than

Originated from experimental optimization where measurements are very noisy Approximation can be actually more accurate than Surrogate (approxmatons) Orgnated from expermental optmzaton where measurements are ver nos Approxmaton can be actuall more accurate than data! Great nterest now n applng these technques to computer smulatons

More information

Comparison of Regression Lines

Comparison of Regression Lines STATGRAPHICS Rev. 9/13/2013 Comparson of Regresson Lnes Summary... 1 Data Input... 3 Analyss Summary... 4 Plot of Ftted Model... 6 Condtonal Sums of Squares... 6 Analyss Optons... 7 Forecasts... 8 Confdence

More information

Module 3 LOSSY IMAGE COMPRESSION SYSTEMS. Version 2 ECE IIT, Kharagpur

Module 3 LOSSY IMAGE COMPRESSION SYSTEMS. Version 2 ECE IIT, Kharagpur Module 3 LOSSY IMAGE COMPRESSION SYSTEMS Verson ECE IIT, Kharagpur Lesson 6 Theory of Quantzaton Verson ECE IIT, Kharagpur Instructonal Objectves At the end of ths lesson, the students should be able to:

More information

Kernel Methods and SVMs Extension

Kernel Methods and SVMs Extension Kernel Methods and SVMs Extenson The purpose of ths document s to revew materal covered n Machne Learnng 1 Supervsed Learnng regardng support vector machnes (SVMs). Ths document also provdes a general

More information

Polynomial Regression Models

Polynomial Regression Models LINEAR REGRESSION ANALYSIS MODULE XII Lecture - 6 Polynomal Regresson Models Dr. Shalabh Department of Mathematcs and Statstcs Indan Insttute of Technology Kanpur Test of sgnfcance To test the sgnfcance

More information

2016 Wiley. Study Session 2: Ethical and Professional Standards Application

2016 Wiley. Study Session 2: Ethical and Professional Standards Application 6 Wley Study Sesson : Ethcal and Professonal Standards Applcaton LESSON : CORRECTION ANALYSIS Readng 9: Correlaton and Regresson LOS 9a: Calculate and nterpret a sample covarance and a sample correlaton

More information

Negative Binomial Regression

Negative Binomial Regression STATGRAPHICS Rev. 9/16/2013 Negatve Bnomal Regresson Summary... 1 Data Input... 3 Statstcal Model... 3 Analyss Summary... 4 Analyss Optons... 7 Plot of Ftted Model... 8 Observed Versus Predcted... 10 Predctons...

More information

A Robust Method for Calculating the Correlation Coefficient

A Robust Method for Calculating the Correlation Coefficient A Robust Method for Calculatng the Correlaton Coeffcent E.B. Nven and C. V. Deutsch Relatonshps between prmary and secondary data are frequently quantfed usng the correlaton coeffcent; however, the tradtonal

More information

Chapter 6. Supplemental Text Material

Chapter 6. Supplemental Text Material Chapter 6. Supplemental Text Materal S6-. actor Effect Estmates are Least Squares Estmates We have gven heurstc or ntutve explanatons of how the estmates of the factor effects are obtaned n the textboo.

More information

Linear Approximation with Regularization and Moving Least Squares

Linear Approximation with Regularization and Moving Least Squares Lnear Approxmaton wth Regularzaton and Movng Least Squares Igor Grešovn May 007 Revson 4.6 (Revson : March 004). 5 4 3 0.5 3 3.5 4 Contents: Lnear Fttng...4. Weghted Least Squares n Functon Approxmaton...

More information

Numerical Heat and Mass Transfer

Numerical Heat and Mass Transfer Master degree n Mechancal Engneerng Numercal Heat and Mass Transfer 06-Fnte-Dfference Method (One-dmensonal, steady state heat conducton) Fausto Arpno f.arpno@uncas.t Introducton Why we use models and

More information

Chapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems

Chapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems Numercal Analyss by Dr. Anta Pal Assstant Professor Department of Mathematcs Natonal Insttute of Technology Durgapur Durgapur-713209 emal: anta.bue@gmal.com 1 . Chapter 5 Soluton of System of Lnear Equatons

More information

Statistics for Economics & Business

Statistics for Economics & Business Statstcs for Economcs & Busness Smple Lnear Regresson Learnng Objectves In ths chapter, you learn: How to use regresson analyss to predct the value of a dependent varable based on an ndependent varable

More information

CSci 6974 and ECSE 6966 Math. Tech. for Vision, Graphics and Robotics Lecture 21, April 17, 2006 Estimating A Plane Homography

CSci 6974 and ECSE 6966 Math. Tech. for Vision, Graphics and Robotics Lecture 21, April 17, 2006 Estimating A Plane Homography CSc 6974 and ECSE 6966 Math. Tech. for Vson, Graphcs and Robotcs Lecture 21, Aprl 17, 2006 Estmatng A Plane Homography Overvew We contnue wth a dscusson of the major ssues, usng estmaton of plane projectve

More information

NUMERICAL DIFFERENTIATION

NUMERICAL DIFFERENTIATION NUMERICAL DIFFERENTIATION 1 Introducton Dfferentaton s a method to compute the rate at whch a dependent output y changes wth respect to the change n the ndependent nput x. Ths rate of change s called the

More information

Uncertainty and auto-correlation in. Measurement

Uncertainty and auto-correlation in. Measurement Uncertanty and auto-correlaton n arxv:1707.03276v2 [physcs.data-an] 30 Dec 2017 Measurement Markus Schebl Federal Offce of Metrology and Surveyng (BEV), 1160 Venna, Austra E-mal: markus.schebl@bev.gv.at

More information

Uncertainty as the Overlap of Alternate Conditional Distributions

Uncertainty as the Overlap of Alternate Conditional Distributions Uncertanty as the Overlap of Alternate Condtonal Dstrbutons Olena Babak and Clayton V. Deutsch Centre for Computatonal Geostatstcs Department of Cvl & Envronmental Engneerng Unversty of Alberta An mportant

More information

On an Extension of Stochastic Approximation EM Algorithm for Incomplete Data Problems. Vahid Tadayon 1

On an Extension of Stochastic Approximation EM Algorithm for Incomplete Data Problems. Vahid Tadayon 1 On an Extenson of Stochastc Approxmaton EM Algorthm for Incomplete Data Problems Vahd Tadayon Abstract: The Stochastc Approxmaton EM (SAEM algorthm, a varant stochastc approxmaton of EM, s a versatle tool

More information

DETERMINATION OF TEMPERATURE DISTRIBUTION FOR ANNULAR FINS WITH TEMPERATURE DEPENDENT THERMAL CONDUCTIVITY BY HPM

DETERMINATION OF TEMPERATURE DISTRIBUTION FOR ANNULAR FINS WITH TEMPERATURE DEPENDENT THERMAL CONDUCTIVITY BY HPM Ganj, Z. Z., et al.: Determnaton of Temperature Dstrbuton for S111 DETERMINATION OF TEMPERATURE DISTRIBUTION FOR ANNULAR FINS WITH TEMPERATURE DEPENDENT THERMAL CONDUCTIVITY BY HPM by Davood Domr GANJI

More information

Chapter 5 Multilevel Models

Chapter 5 Multilevel Models Chapter 5 Multlevel Models 5.1 Cross-sectonal multlevel models 5.1.1 Two-level models 5.1.2 Multple level models 5.1.3 Multple level modelng n other felds 5.2 Longtudnal multlevel models 5.2.1 Two-level

More information

Parametric fractional imputation for missing data analysis. Jae Kwang Kim Survey Working Group Seminar March 29, 2010

Parametric fractional imputation for missing data analysis. Jae Kwang Kim Survey Working Group Seminar March 29, 2010 Parametrc fractonal mputaton for mssng data analyss Jae Kwang Km Survey Workng Group Semnar March 29, 2010 1 Outlne Introducton Proposed method Fractonal mputaton Approxmaton Varance estmaton Multple mputaton

More information

Supporting Information

Supporting Information Supportng Informaton The neural network f n Eq. 1 s gven by: f x l = ReLU W atom x l + b atom, 2 where ReLU s the element-wse rectfed lnear unt, 21.e., ReLUx = max0, x, W atom R d d s the weght matrx to

More information

Department of Quantitative Methods & Information Systems. Time Series and Their Components QMIS 320. Chapter 6

Department of Quantitative Methods & Information Systems. Time Series and Their Components QMIS 320. Chapter 6 Department of Quanttatve Methods & Informaton Systems Tme Seres and Ther Components QMIS 30 Chapter 6 Fall 00 Dr. Mohammad Zanal These sldes were modfed from ther orgnal source for educatonal purpose only.

More information

EEE 241: Linear Systems

EEE 241: Linear Systems EEE : Lnear Systems Summary #: Backpropagaton BACKPROPAGATION The perceptron rule as well as the Wdrow Hoff learnng were desgned to tran sngle layer networks. They suffer from the same dsadvantage: they

More information

Chapter 8 Indicator Variables

Chapter 8 Indicator Variables Chapter 8 Indcator Varables In general, e explanatory varables n any regresson analyss are assumed to be quanttatve n nature. For example, e varables lke temperature, dstance, age etc. are quanttatve n

More information

Report on Image warping

Report on Image warping Report on Image warpng Xuan Ne, Dec. 20, 2004 Ths document summarzed the algorthms of our mage warpng soluton for further study, and there s a detaled descrpton about the mplementaton of these algorthms.

More information

x = , so that calculated

x = , so that calculated Stat 4, secton Sngle Factor ANOVA notes by Tm Plachowsk n chapter 8 we conducted hypothess tests n whch we compared a sngle sample s mean or proporton to some hypotheszed value Chapter 9 expanded ths to

More information

Turbulence classification of load data by the frequency and severity of wind gusts. Oscar Moñux, DEWI GmbH Kevin Bleibler, DEWI GmbH

Turbulence classification of load data by the frequency and severity of wind gusts. Oscar Moñux, DEWI GmbH Kevin Bleibler, DEWI GmbH Turbulence classfcaton of load data by the frequency and severty of wnd gusts Introducton Oscar Moñux, DEWI GmbH Kevn Blebler, DEWI GmbH Durng the wnd turbne developng process, one of the most mportant

More information

Statistics for Business and Economics

Statistics for Business and Economics Statstcs for Busness and Economcs Chapter 11 Smple Regresson Copyrght 010 Pearson Educaton, Inc. Publshng as Prentce Hall Ch. 11-1 11.1 Overvew of Lnear Models n An equaton can be ft to show the best lnear

More information

NON-CENTRAL 7-POINT FORMULA IN THE METHOD OF LINES FOR PARABOLIC AND BURGERS' EQUATIONS

NON-CENTRAL 7-POINT FORMULA IN THE METHOD OF LINES FOR PARABOLIC AND BURGERS' EQUATIONS IJRRAS 8 (3 September 011 www.arpapress.com/volumes/vol8issue3/ijrras_8_3_08.pdf NON-CENTRAL 7-POINT FORMULA IN THE METHOD OF LINES FOR PARABOLIC AND BURGERS' EQUATIONS H.O. Bakodah Dept. of Mathematc

More information

Response Surface Methods Applied to Scarce and Small Sets of Training Points A Comparative Study

Response Surface Methods Applied to Scarce and Small Sets of Training Points A Comparative Study EngOpt 8 - Internatonal Conference on Engneerng Optmzaton Ro de Janero, Brazl, - 5 June 8. Response Surface Methods Appled to Scarce and Small Sets of Tranng Ponts A Comparatve Study COLAÇO, M. J., SILVA,

More information

Difference Equations

Difference Equations Dfference Equatons c Jan Vrbk 1 Bascs Suppose a sequence of numbers, say a 0,a 1,a,a 3,... s defned by a certan general relatonshp between, say, three consecutve values of the sequence, e.g. a + +3a +1

More information

Design and Optimization of Fuzzy Controller for Inverse Pendulum System Using Genetic Algorithm

Design and Optimization of Fuzzy Controller for Inverse Pendulum System Using Genetic Algorithm Desgn and Optmzaton of Fuzzy Controller for Inverse Pendulum System Usng Genetc Algorthm H. Mehraban A. Ashoor Unversty of Tehran Unversty of Tehran h.mehraban@ece.ut.ac.r a.ashoor@ece.ut.ac.r Abstract:

More information

ANOMALIES OF THE MAGNITUDE OF THE BIAS OF THE MAXIMUM LIKELIHOOD ESTIMATOR OF THE REGRESSION SLOPE

ANOMALIES OF THE MAGNITUDE OF THE BIAS OF THE MAXIMUM LIKELIHOOD ESTIMATOR OF THE REGRESSION SLOPE P a g e ANOMALIES OF THE MAGNITUDE OF THE BIAS OF THE MAXIMUM LIKELIHOOD ESTIMATOR OF THE REGRESSION SLOPE Darmud O Drscoll ¹, Donald E. Ramrez ² ¹ Head of Department of Mathematcs and Computer Studes

More information

CHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE

CHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE CHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE Analytcal soluton s usually not possble when exctaton vares arbtrarly wth tme or f the system s nonlnear. Such problems can be solved by numercal tmesteppng

More information

Comparison of the Population Variance Estimators. of 2-Parameter Exponential Distribution Based on. Multiple Criteria Decision Making Method

Comparison of the Population Variance Estimators. of 2-Parameter Exponential Distribution Based on. Multiple Criteria Decision Making Method Appled Mathematcal Scences, Vol. 7, 0, no. 47, 07-0 HIARI Ltd, www.m-hkar.com Comparson of the Populaton Varance Estmators of -Parameter Exponental Dstrbuton Based on Multple Crtera Decson Makng Method

More information

Chapter 9: Statistical Inference and the Relationship between Two Variables

Chapter 9: Statistical Inference and the Relationship between Two Variables Chapter 9: Statstcal Inference and the Relatonshp between Two Varables Key Words The Regresson Model The Sample Regresson Equaton The Pearson Correlaton Coeffcent Learnng Outcomes After studyng ths chapter,

More information

Durban Watson for Testing the Lack-of-Fit of Polynomial Regression Models without Replications

Durban Watson for Testing the Lack-of-Fit of Polynomial Regression Models without Replications Durban Watson for Testng the Lack-of-Ft of Polynomal Regresson Models wthout Replcatons Ruba A. Alyaf, Maha A. Omar, Abdullah A. Al-Shha ralyaf@ksu.edu.sa, maomar@ksu.edu.sa, aalshha@ksu.edu.sa Department

More information

The Multiple Classical Linear Regression Model (CLRM): Specification and Assumptions. 1. Introduction

The Multiple Classical Linear Regression Model (CLRM): Specification and Assumptions. 1. Introduction ECONOMICS 5* -- NOTE (Summary) ECON 5* -- NOTE The Multple Classcal Lnear Regresson Model (CLRM): Specfcaton and Assumptons. Introducton CLRM stands for the Classcal Lnear Regresson Model. The CLRM s also

More information

Suppose that there s a measured wndow of data fff k () ; :::; ff k g of a sze w, measured dscretely wth varable dscretzaton step. It s convenent to pl

Suppose that there s a measured wndow of data fff k () ; :::; ff k g of a sze w, measured dscretely wth varable dscretzaton step. It s convenent to pl RECURSIVE SPLINE INTERPOLATION METHOD FOR REAL TIME ENGINE CONTROL APPLICATIONS A. Stotsky Volvo Car Corporaton Engne Desgn and Development Dept. 97542, HA1N, SE- 405 31 Gothenburg Sweden. Emal: astotsky@volvocars.com

More information

This column is a continuation of our previous column

This column is a continuation of our previous column Comparson of Goodness of Ft Statstcs for Lnear Regresson, Part II The authors contnue ther dscusson of the correlaton coeffcent n developng a calbraton for quanttatve analyss. Jerome Workman Jr. and Howard

More information

Note 10. Modeling and Simulation of Dynamic Systems

Note 10. Modeling and Simulation of Dynamic Systems Lecture Notes of ME 475: Introducton to Mechatroncs Note 0 Modelng and Smulaton of Dynamc Systems Department of Mechancal Engneerng, Unversty Of Saskatchewan, 57 Campus Drve, Saskatoon, SK S7N 5A9, Canada

More information

Econ107 Applied Econometrics Topic 3: Classical Model (Studenmund, Chapter 4)

Econ107 Applied Econometrics Topic 3: Classical Model (Studenmund, Chapter 4) I. Classcal Assumptons Econ7 Appled Econometrcs Topc 3: Classcal Model (Studenmund, Chapter 4) We have defned OLS and studed some algebrac propertes of OLS. In ths topc we wll study statstcal propertes

More information

Linear Feature Engineering 11

Linear Feature Engineering 11 Lnear Feature Engneerng 11 2 Least-Squares 2.1 Smple least-squares Consder the followng dataset. We have a bunch of nputs x and correspondng outputs y. The partcular values n ths dataset are x y 0.23 0.19

More information

One-sided finite-difference approximations suitable for use with Richardson extrapolation

One-sided finite-difference approximations suitable for use with Richardson extrapolation Journal of Computatonal Physcs 219 (2006) 13 20 Short note One-sded fnte-dfference approxmatons sutable for use wth Rchardson extrapolaton Kumar Rahul, S.N. Bhattacharyya * Department of Mechancal Engneerng,

More information

Application of B-Spline to Numerical Solution of a System of Singularly Perturbed Problems

Application of B-Spline to Numerical Solution of a System of Singularly Perturbed Problems Mathematca Aeterna, Vol. 1, 011, no. 06, 405 415 Applcaton of B-Splne to Numercal Soluton of a System of Sngularly Perturbed Problems Yogesh Gupta Department of Mathematcs Unted College of Engneerng &

More information

Boostrapaggregating (Bagging)

Boostrapaggregating (Bagging) Boostrapaggregatng (Baggng) An ensemble meta-algorthm desgned to mprove the stablty and accuracy of machne learnng algorthms Can be used n both regresson and classfcaton Reduces varance and helps to avod

More information

1. Inference on Regression Parameters a. Finding Mean, s.d and covariance amongst estimates. 2. Confidence Intervals and Working Hotelling Bands

1. Inference on Regression Parameters a. Finding Mean, s.d and covariance amongst estimates. 2. Confidence Intervals and Working Hotelling Bands Content. Inference on Regresson Parameters a. Fndng Mean, s.d and covarance amongst estmates.. Confdence Intervals and Workng Hotellng Bands 3. Cochran s Theorem 4. General Lnear Testng 5. Measures of

More information

/ n ) are compared. The logic is: if the two

/ n ) are compared. The logic is: if the two STAT C141, Sprng 2005 Lecture 13 Two sample tests One sample tests: examples of goodness of ft tests, where we are testng whether our data supports predctons. Two sample tests: called as tests of ndependence

More information

and V is a p p positive definite matrix. A normal-inverse-gamma distribution.

and V is a p p positive definite matrix. A normal-inverse-gamma distribution. OSR Journal of athematcs (OSR-J) e-ssn: 78-578, p-ssn: 39-765X. Volume 3, ssue 3 Ver. V (ay - June 07), PP 68-7 www.osrjournals.org Comparng The Performance of Bayesan And Frequentst Analyss ethods of

More information

Generalized Linear Methods

Generalized Linear Methods Generalzed Lnear Methods 1 Introducton In the Ensemble Methods the general dea s that usng a combnaton of several weak learner one could make a better learner. More formally, assume that we have a set

More information

x i1 =1 for all i (the constant ).

x i1 =1 for all i (the constant ). Chapter 5 The Multple Regresson Model Consder an economc model where the dependent varable s a functon of K explanatory varables. The economc model has the form: y = f ( x,x,..., ) xk Approxmate ths by

More information

4 Analysis of Variance (ANOVA) 5 ANOVA. 5.1 Introduction. 5.2 Fixed Effects ANOVA

4 Analysis of Variance (ANOVA) 5 ANOVA. 5.1 Introduction. 5.2 Fixed Effects ANOVA 4 Analyss of Varance (ANOVA) 5 ANOVA 51 Introducton ANOVA ANOVA s a way to estmate and test the means of multple populatons We wll start wth one-way ANOVA If the populatons ncluded n the study are selected

More information

Supplementary Notes for Chapter 9 Mixture Thermodynamics

Supplementary Notes for Chapter 9 Mixture Thermodynamics Supplementary Notes for Chapter 9 Mxture Thermodynamcs Key ponts Nne major topcs of Chapter 9 are revewed below: 1. Notaton and operatonal equatons for mxtures 2. PVTN EOSs for mxtures 3. General effects

More information

Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur

Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur Analyss of Varance and Desgn of Experment-I MODULE VII LECTURE - 3 ANALYSIS OF COVARIANCE Dr Shalabh Department of Mathematcs and Statstcs Indan Insttute of Technology Kanpur Any scentfc experment s performed

More information

LINEAR REGRESSION ANALYSIS. MODULE IX Lecture Multicollinearity

LINEAR REGRESSION ANALYSIS. MODULE IX Lecture Multicollinearity LINEAR REGRESSION ANALYSIS MODULE IX Lecture - 31 Multcollnearty Dr. Shalabh Department of Mathematcs and Statstcs Indan Insttute of Technology Kanpur 6. Rdge regresson The OLSE s the best lnear unbased

More information

Lab 2e Thermal System Response and Effective Heat Transfer Coefficient

Lab 2e Thermal System Response and Effective Heat Transfer Coefficient 58:080 Expermental Engneerng 1 OBJECTIVE Lab 2e Thermal System Response and Effectve Heat Transfer Coeffcent Warnng: though the experment has educatonal objectves (to learn about bolng heat transfer, etc.),

More information

Chapter 6. Supplemental Text Material. Run, i X i1 X i2 X i1 X i2 Response total (1) a b ab

Chapter 6. Supplemental Text Material. Run, i X i1 X i2 X i1 X i2 Response total (1) a b ab Chapter 6. Supplemental Text Materal 6-. actor Effect Estmates are Least Squares Estmates We have gven heurstc or ntutve explanatons of how the estmates of the factor effects are obtaned n the textboo.

More information

ISSN: ISO 9001:2008 Certified International Journal of Engineering and Innovative Technology (IJEIT) Volume 3, Issue 1, July 2013

ISSN: ISO 9001:2008 Certified International Journal of Engineering and Innovative Technology (IJEIT) Volume 3, Issue 1, July 2013 ISSN: 2277-375 Constructon of Trend Free Run Orders for Orthogonal rrays Usng Codes bstract: Sometmes when the expermental runs are carred out n a tme order sequence, the response can depend on the run

More information

LOW BIAS INTEGRATED PATH ESTIMATORS. James M. Calvin

LOW BIAS INTEGRATED PATH ESTIMATORS. James M. Calvin Proceedngs of the 007 Wnter Smulaton Conference S G Henderson, B Bller, M-H Hseh, J Shortle, J D Tew, and R R Barton, eds LOW BIAS INTEGRATED PATH ESTIMATORS James M Calvn Department of Computer Scence

More information

Inductance Calculation for Conductors of Arbitrary Shape

Inductance Calculation for Conductors of Arbitrary Shape CRYO/02/028 Aprl 5, 2002 Inductance Calculaton for Conductors of Arbtrary Shape L. Bottura Dstrbuton: Internal Summary In ths note we descrbe a method for the numercal calculaton of nductances among conductors

More information

DUE: WEDS FEB 21ST 2018

DUE: WEDS FEB 21ST 2018 HOMEWORK # 1: FINITE DIFFERENCES IN ONE DIMENSION DUE: WEDS FEB 21ST 2018 1. Theory Beam bendng s a classcal engneerng analyss. The tradtonal soluton technque makes smplfyng assumptons such as a constant

More information

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U)

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U) Econ 413 Exam 13 H ANSWERS Settet er nndelt 9 deloppgaver, A,B,C, som alle anbefales å telle lkt for å gøre det ltt lettere å stå. Svar er gtt . Unfortunately, there s a prntng error n the hnt of

More information

Support Vector Machines. Vibhav Gogate The University of Texas at dallas

Support Vector Machines. Vibhav Gogate The University of Texas at dallas Support Vector Machnes Vbhav Gogate he Unversty of exas at dallas What We have Learned So Far? 1. Decson rees. Naïve Bayes 3. Lnear Regresson 4. Logstc Regresson 5. Perceptron 6. Neural networks 7. K-Nearest

More information

Statistical Energy Analysis for High Frequency Acoustic Analysis with LS-DYNA

Statistical Energy Analysis for High Frequency Acoustic Analysis with LS-DYNA 14 th Internatonal Users Conference Sesson: ALE-FSI Statstcal Energy Analyss for Hgh Frequency Acoustc Analyss wth Zhe Cu 1, Yun Huang 1, Mhamed Soul 2, Tayeb Zeguar 3 1 Lvermore Software Technology Corporaton

More information

4.3 Poisson Regression

4.3 Poisson Regression of teratvely reweghted least squares regressons (the IRLS algorthm). We do wthout gvng further detals, but nstead focus on the practcal applcaton. > glm(survval~log(weght)+age, famly="bnomal", data=baby)

More information

Lecture 12: Classification

Lecture 12: Classification Lecture : Classfcaton g Dscrmnant functons g The optmal Bayes classfer g Quadratc classfers g Eucldean and Mahalanobs metrcs g K Nearest Neghbor Classfers Intellgent Sensor Systems Rcardo Guterrez-Osuna

More information

3.1 Expectation of Functions of Several Random Variables. )' be a k-dimensional discrete or continuous random vector, with joint PMF p (, E X E X1 E X

3.1 Expectation of Functions of Several Random Variables. )' be a k-dimensional discrete or continuous random vector, with joint PMF p (, E X E X1 E X Statstcs 1: Probablty Theory II 37 3 EPECTATION OF SEVERAL RANDOM VARIABLES As n Probablty Theory I, the nterest n most stuatons les not on the actual dstrbuton of a random vector, but rather on a number

More information

For now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results.

For now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results. Neural Networks : Dervaton compled by Alvn Wan from Professor Jtendra Malk s lecture Ths type of computaton s called deep learnng and s the most popular method for many problems, such as computer vson

More information

See Book Chapter 11 2 nd Edition (Chapter 10 1 st Edition)

See Book Chapter 11 2 nd Edition (Chapter 10 1 st Edition) Count Data Models See Book Chapter 11 2 nd Edton (Chapter 10 1 st Edton) Count data consst of non-negatve nteger values Examples: number of drver route changes per week, the number of trp departure changes

More information

A Comparative Study for Estimation Parameters in Panel Data Model

A Comparative Study for Estimation Parameters in Panel Data Model A Comparatve Study for Estmaton Parameters n Panel Data Model Ahmed H. Youssef and Mohamed R. Abonazel hs paper examnes the panel data models when the regresson coeffcents are fxed random and mxed and

More information

Predictive Analytics : QM901.1x Prof U Dinesh Kumar, IIMB. All Rights Reserved, Indian Institute of Management Bangalore

Predictive Analytics : QM901.1x Prof U Dinesh Kumar, IIMB. All Rights Reserved, Indian Institute of Management Bangalore Sesson Outlne Introducton to classfcaton problems and dscrete choce models. Introducton to Logstcs Regresson. Logstc functon and Logt functon. Maxmum Lkelhood Estmator (MLE) for estmaton of LR parameters.

More information

Development of a Semi-Automated Approach for Regional Corrector Surface Modeling in GPS-Levelling

Development of a Semi-Automated Approach for Regional Corrector Surface Modeling in GPS-Levelling Development of a Sem-Automated Approach for Regonal Corrector Surface Modelng n GPS-Levellng G. Fotopoulos, C. Kotsaks, M.G. Sders, and N. El-Shemy Presented at the Annual Canadan Geophyscal Unon Meetng

More information

Chapter 15 Student Lecture Notes 15-1

Chapter 15 Student Lecture Notes 15-1 Chapter 15 Student Lecture Notes 15-1 Basc Busness Statstcs (9 th Edton) Chapter 15 Multple Regresson Model Buldng 004 Prentce-Hall, Inc. Chap 15-1 Chapter Topcs The Quadratc Regresson Model Usng Transformatons

More information

BOOTSTRAP METHOD FOR TESTING OF EQUALITY OF SEVERAL MEANS. M. Krishna Reddy, B. Naveen Kumar and Y. Ramu

BOOTSTRAP METHOD FOR TESTING OF EQUALITY OF SEVERAL MEANS. M. Krishna Reddy, B. Naveen Kumar and Y. Ramu BOOTSTRAP METHOD FOR TESTING OF EQUALITY OF SEVERAL MEANS M. Krshna Reddy, B. Naveen Kumar and Y. Ramu Department of Statstcs, Osmana Unversty, Hyderabad -500 007, Inda. nanbyrozu@gmal.com, ramu0@gmal.com

More information

A Hybrid Variational Iteration Method for Blasius Equation

A Hybrid Variational Iteration Method for Blasius Equation Avalable at http://pvamu.edu/aam Appl. Appl. Math. ISSN: 1932-9466 Vol. 10, Issue 1 (June 2015), pp. 223-229 Applcatons and Appled Mathematcs: An Internatonal Journal (AAM) A Hybrd Varatonal Iteraton Method

More information

A LINEAR PROGRAM TO COMPARE MULTIPLE GROSS CREDIT LOSS FORECASTS. Dr. Derald E. Wentzien, Wesley College, (302) ,

A LINEAR PROGRAM TO COMPARE MULTIPLE GROSS CREDIT LOSS FORECASTS. Dr. Derald E. Wentzien, Wesley College, (302) , A LINEAR PROGRAM TO COMPARE MULTIPLE GROSS CREDIT LOSS FORECASTS Dr. Derald E. Wentzen, Wesley College, (302) 736-2574, wentzde@wesley.edu ABSTRACT A lnear programmng model s developed and used to compare

More information

Basic Business Statistics, 10/e

Basic Business Statistics, 10/e Chapter 13 13-1 Basc Busness Statstcs 11 th Edton Chapter 13 Smple Lnear Regresson Basc Busness Statstcs, 11e 009 Prentce-Hall, Inc. Chap 13-1 Learnng Objectves In ths chapter, you learn: How to use regresson

More information

Appendix B: Resampling Algorithms

Appendix B: Resampling Algorithms 407 Appendx B: Resamplng Algorthms A common problem of all partcle flters s the degeneracy of weghts, whch conssts of the unbounded ncrease of the varance of the mportance weghts ω [ ] of the partcles

More information

STAT 3008 Applied Regression Analysis

STAT 3008 Applied Regression Analysis STAT 3008 Appled Regresson Analyss Tutoral : Smple Lnear Regresson LAI Chun He Department of Statstcs, The Chnese Unversty of Hong Kong 1 Model Assumpton To quantfy the relatonshp between two factors,

More information

Testing for seasonal unit roots in heterogeneous panels

Testing for seasonal unit roots in heterogeneous panels Testng for seasonal unt roots n heterogeneous panels Jesus Otero * Facultad de Economía Unversdad del Rosaro, Colomba Jeremy Smth Department of Economcs Unversty of arwck Monca Gulett Aston Busness School

More information

is the calculated value of the dependent variable at point i. The best parameters have values that minimize the squares of the errors

is the calculated value of the dependent variable at point i. The best parameters have values that minimize the squares of the errors Multple Lnear and Polynomal Regresson wth Statstcal Analyss Gven a set of data of measured (or observed) values of a dependent varable: y versus n ndependent varables x 1, x, x n, multple lnear regresson

More information

COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS

COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS Avalable onlne at http://sck.org J. Math. Comput. Sc. 3 (3), No., 6-3 ISSN: 97-537 COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS

More information

The Ordinary Least Squares (OLS) Estimator

The Ordinary Least Squares (OLS) Estimator The Ordnary Least Squares (OLS) Estmator 1 Regresson Analyss Regresson Analyss: a statstcal technque for nvestgatng and modelng the relatonshp between varables. Applcatons: Engneerng, the physcal and chemcal

More information

Statistics MINITAB - Lab 2

Statistics MINITAB - Lab 2 Statstcs 20080 MINITAB - Lab 2 1. Smple Lnear Regresson In smple lnear regresson we attempt to model a lnear relatonshp between two varables wth a straght lne and make statstcal nferences concernng that

More information

PHYS 450 Spring semester Lecture 02: Dealing with Experimental Uncertainties. Ron Reifenberger Birck Nanotechnology Center Purdue University

PHYS 450 Spring semester Lecture 02: Dealing with Experimental Uncertainties. Ron Reifenberger Birck Nanotechnology Center Purdue University PHYS 45 Sprng semester 7 Lecture : Dealng wth Expermental Uncertantes Ron Refenberger Brck anotechnology Center Purdue Unversty Lecture Introductory Comments Expermental errors (really expermental uncertantes)

More information

Regularized Discriminant Analysis for Face Recognition

Regularized Discriminant Analysis for Face Recognition 1 Regularzed Dscrmnant Analyss for Face Recognton Itz Pma, Mayer Aladem Department of Electrcal and Computer Engneerng, Ben-Guron Unversty of the Negev P.O.Box 653, Beer-Sheva, 845, Israel. Abstract Ths

More information

The Study of Teaching-learning-based Optimization Algorithm

The Study of Teaching-learning-based Optimization Algorithm Advanced Scence and Technology Letters Vol. (AST 06), pp.05- http://dx.do.org/0.57/astl.06. The Study of Teachng-learnng-based Optmzaton Algorthm u Sun, Yan fu, Lele Kong, Haolang Q,, Helongang Insttute

More information

Lecture Notes on Linear Regression

Lecture Notes on Linear Regression Lecture Notes on Lnear Regresson Feng L fl@sdueducn Shandong Unversty, Chna Lnear Regresson Problem In regresson problem, we am at predct a contnuous target value gven an nput feature vector We assume

More information

Lossy Compression. Compromise accuracy of reconstruction for increased compression.

Lossy Compression. Compromise accuracy of reconstruction for increased compression. Lossy Compresson Compromse accuracy of reconstructon for ncreased compresson. The reconstructon s usually vsbly ndstngushable from the orgnal mage. Typcally, one can get up to 0:1 compresson wth almost

More information

A new Approach for Solving Linear Ordinary Differential Equations

A new Approach for Solving Linear Ordinary Differential Equations , ISSN 974-57X (Onlne), ISSN 974-5718 (Prnt), Vol. ; Issue No. 1; Year 14, Copyrght 13-14 by CESER PUBLICATIONS A new Approach for Solvng Lnear Ordnary Dfferental Equatons Fawz Abdelwahd Department of

More information

Markov Chain Monte Carlo (MCMC), Gibbs Sampling, Metropolis Algorithms, and Simulated Annealing Bioinformatics Course Supplement

Markov Chain Monte Carlo (MCMC), Gibbs Sampling, Metropolis Algorithms, and Simulated Annealing Bioinformatics Course Supplement Markov Chan Monte Carlo MCMC, Gbbs Samplng, Metropols Algorthms, and Smulated Annealng 2001 Bonformatcs Course Supplement SNU Bontellgence Lab http://bsnuackr/ Outlne! Markov Chan Monte Carlo MCMC! Metropols-Hastngs

More information

Number of cases Number of factors Number of covariates Number of levels of factor i. Value of the dependent variable for case k

Number of cases Number of factors Number of covariates Number of levels of factor i. Value of the dependent variable for case k ANOVA Model and Matrx Computatons Notaton The followng notaton s used throughout ths chapter unless otherwse stated: N F CN Y Z j w W Number of cases Number of factors Number of covarates Number of levels

More information

MACHINE APPLIED MACHINE LEARNING LEARNING. Gaussian Mixture Regression

MACHINE APPLIED MACHINE LEARNING LEARNING. Gaussian Mixture Regression 11 MACHINE APPLIED MACHINE LEARNING LEARNING MACHINE LEARNING Gaussan Mture Regresson 22 MACHINE APPLIED MACHINE LEARNING LEARNING Bref summary of last week s lecture 33 MACHINE APPLIED MACHINE LEARNING

More information