SEASONAL TIME SERIES PREDICTION WITH ARTIFICIAL NEURAL NETWORKS AND LOCAL MEASURES. R. Pinto, S. Cavalieri


R. Pinto, S. Cavalieri
University of Bergamo, Department of Industrial Engineering, viale Marconi 5, Dalmine (BG), Italy

Abstract: Forecasting is one of the most challenging fields in industrial research, due to its importance in practice and to the variability and number of elements that should be considered. In this context, the use of Artificial Neural Networks has proved to be particularly advisable, thanks to their ability to approximate almost any kind of function within a desirable range. While most of the literature deals with the definition of the characteristics of an ANN, only a few contributions address the pre-analysis of the data, and there seems to be no recent work on the pre-processing of the patterns submitted as input to the ANN. The aim of this paper is to propose a novel approach to time series forecasting through the identification and exploitation of information hidden or latent in the values and the structure of a seasonal time series. To this end, the time series is decomposed into parts, and for each part some local measures are evaluated: such measures are intended to improve the forecasting ability of the ANN. Moreover, to exploit the regularity of a seasonal time series, the concept of Seasonal Period Index (SPI) is introduced. The results obtained from the tests confirm the effectiveness of the local measures and of the SPI. Copyright 2005 IFAC

Keywords: Artificial Neural Networks, Forecasting, Local measures, Seasonal time series, Levenberg-Marquardt learning algorithm.

1. INTRODUCTION

Forecasting is one of the most challenging fields in industrial research, mainly due to its importance in any demand planning process and to the variability and number of elements that should be considered in the implementation of an effective prediction process, supported by suitable methods and techniques. Forecasting techniques range from quantitative to subjective evaluation, depending on the objective of the forecast activity and the type of prediction being obtained. Focusing on quantitative methods, two main classes can be singled out:

Extrapolative methods: these methods aim to find some characteristics of a set of sequential observations (referred to as a time series) that show a repetitive behaviour. Exponential smoothing techniques are an example of these methods.

Explicative methods: these methods correlate one or more independent variables to one observed variable (i.e. dependent on the other variables), which has to be predicted. Examples are simple and multiple regression models.

In this context, Artificial Neural Networks (ANN), whose application in forecasting is discussed in this paper, are quantitative techniques which can be used either as universal regressors, thanks to their ability to approximate almost any kind of function (mainly feedforward (Thiesing and Vornberger, 1997; Crone, 2002) and Radial Basis Function networks (Ciocoiu, 1998; Zemouri et al., 2003)), or as extrapolative methods, thanks to their ability to recognize pattern characteristics through time (especially time delay (Conway et al., 1998) and recurrent neural networks). While most of the literature deals with the characteristics of an ANN (Hegazy and Salama, 1995), i.e. topology, number of layers, number of neurons per layer, activation functions and so on, only a few contributions address the pre-analysis of the data. For example, the analysis of the input data could be performed through Principal Component Analysis, in order to reduce the size of the input space (Cloarec and Ringwood, 1998) and avoid the curse of dimensionality (the exponential growth of the computational effort needed as the input space grows). On the other hand, there seems to be no recent work on the process of forming the patterns to submit as input to the ANN. The idea at the basis of this work is that input patterns should be carefully organized so as to contain not only the values of the time series, as is generally done in most applications, but also other information gathered from the time series itself. This information can be obtained through an appropriate pre-processing activity performed on the data. In the remainder of this paper, an input pattern definition process is proposed, in order to enrich the information that the ANN can exploit for forecasting. More precisely, the paper is organized as follows: Section 2 describes the differences between the traditional approach to forecasting with ANNs and the proposed new approach; Section 3 presents the assumptions at the basis of the proposed forecasting model and the structure of the input pattern, characterized by the concept of local measures. In Section 4 the definition process of the input pattern is presented: this process is the basis of the forecasting model and introduces the concept of Seasonal Period Index. Section 5 briefly describes the structure of the ANN, while Section 6 reports the results obtained during the application of the model. Finally, Section 7 offers some concluding remarks and proposes further extensions.

2. FORECASTING TIME SERIES USING NEURAL NETWORKS

The forecasting of a time series through an ANN can be performed in several ways, as stated in the previous section. The general approach (with the exception of time delay ANNs) consists in partitioning the time series into a set of equally sized patterns, i.e. with the same number of observations per pattern, that are presented to the ANN during the learning phase together with the expected response. This approach is based on the assumption that each observation A_i depends on the previous t values of the time series A_{i-1}, A_{i-2}, ..., A_{i-t}. Starting from this approach, the idea at the basis of the model proposed in this work is to improve the performance of the ANN using information that is not explicitly represented by the time series values, but is hidden both in the values and in the structure of the time series. To accomplish this, the assumption is extended: each observation A_i of the time series can be seen as a combination of the information somehow connected to, or represented by, the previous t observations. It is important to note that we used the term information, which implies only an indirect reference to the numerical values of the observations, while referring strongly to other types of knowledge that are considered embedded, but hidden, in the time series. Given this assumption, the key of the forecasting task becomes the identification and the exploitation of such information.
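To make the traditional windowing just described concrete, the following sketch (Python; the helper name, window length and synthetic series are illustrative assumptions, not code from the paper) partitions a series into equally sized patterns of t consecutive observations, each paired with the following value as the target:

```python
import numpy as np

def make_windows(series, t):
    """Partition a series into patterns of t consecutive values,
    each paired with the next observation as the target."""
    patterns, targets = [], []
    for i in range(len(series) - t):
        patterns.append(series[i:i + t])   # A_{i+1}, ..., A_{i+t}
        targets.append(series[i + t])      # the observation to predict
    return np.array(patterns), np.array(targets)

# Example: a synthetic monthly series, windows of t = 12 observations
series = 100 + 0.5 * np.arange(96) + 20 * np.sin(2 * np.pi * np.arange(96) / 12)
X, y = make_windows(series, t=12)
print(X.shape, y.shape)   # (84, 12) (84,)
```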
For example, the value of the next observation could be related to the mean of the previous t values (as in moving average methods): the mean of the t values is a piece of information which is different from the numerical values of the observations. The reference to moving average methods is also significant to explain another, reasonable extension of the starting assumption: since the moving average considers only the last t values of the time series, we can assume that the information is local, i.e. it is related only to a piece of the time series, and can be different depending on the part considered (see for example Nottingham and Cook, 2001). For all these reasons, the proposed model, instead of focusing attention on the structure and the characteristics that the ANN should have in order to perform adequately, focuses on the data (i.e. the time series in input) and on the information that can be extracted from them, in order to support the forecasting activity. In particular, attention is concentrated on seasonal time series (with and without trend), because of the intrinsic difficulty involved in the prediction of such series.

3. THE ASSUMPTIONS OF THE FORECASTING MODEL

Each forecasting technique is based on some assumptions and hypotheses; ours can be stated as follows:

a) Any behaviour of the time series observed in the past will repeat itself in the future.

b) Any observation of the time series can be seen as the result of the combination of the information embedded implicitly or explicitly in previous observations.

c) Such information is local to a particular subset of the available time series, i.e. such information changes depending on the portion of the time series under analysis.

d) There is a totally random fluctuation that cannot reasonably be eliminated, but the entire series is not completely random (Not Random Walk Hypothesis).

The first assumption is the well-known Hypothesis of Continuity, which is the basis for almost all forecasting methods based on time series analysis: we can only predict the future by studying the past and assuming that the past will repeat itself in the future, hopefully almost unvaried. The second assumption is derived from the so-called Weak Form Efficient Market Theory (Fama, 1970; Sitte and Sitte, 2002), of which the Random Walk model (mentioned in the fourth assumption) is an extension. As explained in Section 2, each observation is seen not only as a combination of previous values (as in moving average methods or in exponential smoothing techniques), but it is also influenced by the information (i.e. not explicit, but in some cases easily obtainable) embedded in previous values. Such information is quantitative, due to the mathematical nature of the ANN. The third assumption states that, if we consider only a subset of the time series, we can find in such a set enough information to forecast the next value of the series. In other words, each observation has in its neighbourhood a hidden, latent piece of information that can help in the prediction. Finally, the fourth assumption says that it is almost impossible to predict exactly the values of a real time series, because of the presence of random and unknown hidden phenomena. It is difficult to model or quantify such elements, so the prediction should not be represented by a single value, but by a range of possible values. Given these assumptions, in our work we attempted to improve the forecasting results provided by a feedforward ANN by exploiting information embedded in the time series. More precisely, we speak of local information, meaning a set of measures (as said, the information considered is quantitative) evaluated considering only a few observations at a time. In practice, time series are partitioned into subsets, each characterized by the following measures: (i) mean value; (ii) variance; (iii) slope of the regression line that fits the subset (referred to as regression slope in the following). Each subset, with its related local measures, is the basis for the construction of the input patterns for the network learning process.

4. DEFINITION OF THE INPUT PATTERN

The main task of the proposed model is the definition of the input patterns of the ANN, starting from the sequence of values of the time series. Before describing the structure of the input patterns, it is useful to introduce some definitions:

LS: the length of the time series (i.e. the number of observations in the series);

W: the window extension, meaning the number of time series observations that constitute an input pattern;

L: the number of observations within a single seasonal cycle; for example, if the season spans one year and the observations are registered monthly, L will be equal to 12; if, instead, the observations are gathered every 3 months, L will be equal to 4.

Starting from a seasonal time series of length LS, each input pattern is formed by W sequential observations, while the target value is represented by the (W+1)-th value. As defined previously, we make use of local information for each pattern, represented by the mean and the variance of the W observations of the pattern, plus the slope of the regression line that fits such W observations. Moreover, in order to exploit the regularity of seasonal time series, we introduce a further element.
If the time series has a seasonality period of L observations (i.e. the series repeats itself every L observations), we can provide the ANN with this information, thus reducing the training period and improving the quality of the results. For this reason, beyond the measures illustrated above, we add a further set of inputs, namely the Seasonal Period Index (SPI). This is an array of L binary values, subject to the following constraint:

\sum_{i=1}^{L} SPI[i] = 1

where SPI[i] represents the i-th element of the seasonal array. Thus, for each pattern only one element of the SPI can be set to 1. The position p of the array cell which stores the non-zero value in a pattern is defined according to the following expression:

p = (h \bmod L) + 1

where h is the position in the time series of the target value of the pattern being formed (so h ∈ [1; LS]). Obviously, p ∈ [1; L]. As a result, we obtain a set of patterns composed of W + 3 + L elements. Figure 1 may help in understanding the process illustrated above. As shown, the first pattern is composed of three sequential values of the time series (W = 3). The mean, the variance and the regression slope of these three values are computed and form the next three elements of the pattern. Most interesting is the last part of the pattern: supposing that the season spans four periods, we add four binary values as defined before.
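As an illustration of the pattern-definition process described above, the sketch below (Python; the function name and the small numeric series are assumptions made for illustration, while W = 3 and L = 4 mirror the example in the text) assembles one input pattern of W + 3 + L elements and places the single 1 of the SPI at position p = (h mod L) + 1:

```python
import numpy as np

def build_pattern(series, start, W, L):
    """Build one input pattern: W observations, mean, variance,
    regression slope, and the L-element Seasonal Period Index."""
    window = np.asarray(series[start:start + W], dtype=float)
    target = series[start + W]

    mean = window.mean()
    variance = window.var()
    # slope of the regression line fitted to the W observations
    slope = np.polyfit(np.arange(W), window, 1)[0]

    h = start + W + 1            # 1-based position of the target value
    p = (h % L) + 1              # non-zero SPI position, p in [1, L]
    spi = np.zeros(L)
    spi[p - 1] = 1.0             # exactly one element of the SPI is 1

    pattern = np.concatenate([window, [mean, variance, slope], spi])
    return pattern, target       # pattern has W + 3 + L elements

# Illustrative first pattern: W = 3, L = 4, target at position h = 4
series = [112, 118, 132, 129, 121, 135, 148, 148]
x, t = build_pattern(series, start=0, W=3, L=4)
print(len(x), t)                 # 10 elements (3 + 3 + 4), target 129
```

With start = 0 the target sits at h = 4, so with L = 4 the 1 falls in the first SPI cell, matching the worked example that follows.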

Fig. 1. The input pattern definition process (the W observations, followed by their mean, variance and regression slope, the Seasonal Period Index and the target value)

In order to define which is the only non-zero value, we have to look at the target value: since the target value is in the fourth position (h = 4) and L = 4, the non-zero position in the SPI is

p = (4 \bmod 4) + 1 = 1

so the 1 should be positioned in the first cell of the SPI array. The second pattern will have the 1 in the second position of the SPI array, and so on.

5. THE ANN AND THE LEARNING ALGORITHM

The ANN used in this work is a feedforward backpropagation network with one hidden layer (with logistic activation function), trained with the Levenberg-Marquardt (LM) supervised learning algorithm. This algorithm, like the quasi-Newton methods, has been purposely designed to provide an approximation of the Hessian matrix of the performance function, improving the speed of learning, especially for moderate-sized networks. This algorithm proved to be the best one in the tests we conducted, both in the number of epochs needed to reach the target error and in the quality of the result, measured in terms of Mean Absolute Percentage Error (MAPE):

MAPE = \frac{100}{n} \sum_{i=1}^{n} \frac{|o_i - t_i|}{t_i}

where o_i is the output of the network and t_i is the target value, i.e. the desired value. Moreover, the LM algorithm is particularly suitable whenever a relatively small number of observations is available. Two different objective functions have been tested: the simple Mean Squared Error (MSE) and the MSE with regularization (MSEREG), as defined in the Matlab package:

MSEREG = \lambda \, MSE + (1 - \lambda) \, MSW, \quad MSW = \frac{1}{n} \sum_{i=1}^{n} w_i^2

where \lambda is called the performance ratio (\lambda ∈ [0; 1]; if \lambda = 1, then MSEREG ≡ MSE) and w_i is the i-th weight of the ANN. This function, used as the objective function of the learning phase, aims to reduce the weights of the network, forcing the network response to be smoother and avoiding the risk of data overfitting, which occurs when the ANN has memorized the training patterns well but shows really poor performance on new data sets.

6. RESULTS

The proposed approach has been tested on a number of seasonal sales time series, obtained from various sources. For the sake of clarity, only the test conducted on a seasonal time series with trend (Figure 2) is reported. Tests have been conducted on an ANN with a hidden sigmoidal layer with a variable number of neurons (ranging from 5 to 10), trained with the same learning algorithm (LM) but with different inputs:

Fig. 2. Sample of the data used for the test

A) with the set of patterns constituted by the rolling W observations of the time series;

B) with the set of patterns including the W rolling observations, the local measures and the SPI.

The results of a series of tests are reported in Table 1, where NN_LM indicates an ANN trained with the patterns using the local measures and SPI, constructed as illustrated in Section 4. Each network and pattern configuration has been tested 7 times, and Table 1 reports the mean, the minimum and the maximum MAPE reached on both the training and the test set. As can be seen, the local measures improve the response of the ANN in almost all the tests conducted. Moreover, the usage of the MSEREG function results in an improvement of the generalization ability of the network.
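For reference, a minimal sketch of the two error measures defined above (Python; written directly from the formulas in the text, with the network weights passed in as a plain array rather than taken from the paper's Matlab implementation):

```python
import numpy as np

def mape(outputs, targets):
    """Mean Absolute Percentage Error: (100/n) * sum(|o_i - t_i| / t_i)."""
    outputs, targets = np.asarray(outputs, float), np.asarray(targets, float)
    return 100.0 * np.mean(np.abs(outputs - targets) / np.abs(targets))

def msereg(outputs, targets, weights, lam=0.9):
    """Regularized objective: lam * MSE + (1 - lam) * MSW,
    where MSW is the mean of the squared network weights."""
    mse = np.mean((np.asarray(outputs, float) - np.asarray(targets, float)) ** 2)
    msw = np.mean(np.asarray(weights, float) ** 2)
    return lam * mse + (1.0 - lam) * msw

# Small illustrative check
print(mape([95, 110], [100, 100]))                                  # 7.5
print(msereg([95, 110], [100, 100], weights=[0.2, -0.5, 1.0]))
```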

Table 1. Results of the tests (MAPE, %)

Objective  Window  Metric       Training NN_LM  Training NN  Test NN_LM  Test NN
MSEREG     5       Mean error   9,42            7,63         3,9         25,65
                   Min error    6,79            8,90         0,5         6,97
                   Max error    ,00             30,68        6,44        49,5
                   Mean error   0,83            7,46         6,69        24,34
                   Min error    9,37            3,65         4,42        8,62
                   Max error    5,34            2,5          24,28       3,42
                   Mean error   9,77            6,93         5,39        24,34
                   Min error    6,25            ,95          9,79        5,4
                   Max error    4,8             23,80        22,89       36,
MSE                Mean error   0,06            32,02        5,90        54,35
                   Min error    7,88            20,87        ,76         34,73
                   Max error    ,89             38,98        9,5         66,73
                   Mean error   ,5              27,76        8,34        46,82
                   Min error    7,80            9,26         ,55         32,5
                   Max error    20,26           42,46        33,74       72,74
                   Mean error   3,48            2,7          22,3        36,5
                   Min error    8,35            ,            3,3         7,49
                   Max error    2,38            34,27        36,27       58,

Figure 3 shows more clearly the differences between the two approaches: with the usage of the local measures and the SPI, the forecast on the training set is more accurate.

Fig. 3. Time series (dotted line) versus forecast on the training set

As stated before, an ANN can approximate arbitrarily well almost any kind of function, given a suitable set of data. Most of the problems arise when the ANN has to generalize its knowledge, i.e. when the ANN has to predict the next value of a time series on the basis of inputs never seen before. The usage of the MSEREG avoids the overfitting problem, but nothing can be said precisely about the performance of the network when tested on a set of inputs that have not been presented to the ANN during the training phase. This is the reason why a test set has also been used: the test set is made up of observations that have not been presented to the ANN during the training process.
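Before looking at Figure 4, here is a compact end-to-end sketch of the comparison between configurations A and B described in this section (Python, using scikit-learn's MLPRegressor with the lbfgs solver purely as a stand-in, since the paper's Matlab Levenberg-Marquardt setup is not reproduced; the synthetic series, window size and train/test split are assumptions):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_patterns(series, W, L, use_extras=True):
    """Rolling patterns of W values; optionally append the local measures
    (mean, variance, regression slope) and the one-hot SPI."""
    X, y = [], []
    for i in range(len(series) - W):
        w = np.asarray(series[i:i + W], dtype=float)
        row = list(w)
        if use_extras:
            slope = np.polyfit(np.arange(W), w, 1)[0]
            spi = np.zeros(L)
            spi[(i + W + 1) % L] = 1.0   # 0-based index of p = (h mod L) + 1
            row += [w.mean(), w.var(), slope, *spi]
        X.append(row)
        y.append(series[i + W])
    return np.array(X), np.array(y)

# Synthetic seasonal series with trend, for illustration only
np.random.seed(0)
t = np.arange(120)
series = 100 + 0.5 * t + 20 * np.sin(2 * np.pi * t / 12) + np.random.normal(0, 3, 120)

for extras in (False, True):             # configuration A, then configuration B
    X, y = make_patterns(series, W=12, L=12, use_extras=extras)
    X_tr, X_te, y_tr, y_te = X[:90], X[90:], y[:90], y[90:]
    net = MLPRegressor(hidden_layer_sizes=(8,), activation='logistic',
                       solver='lbfgs', max_iter=5000, random_state=0)
    net.fit(X_tr, y_tr)
    mape = 100 * np.mean(np.abs(net.predict(X_te) - y_te) / y_te)
    print("extras =", extras, "test MAPE (%):", round(mape, 2))
```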

Fig. 4. Time series (dotted line) versus forecast on the test set

As can be seen in Figure 4, which depicts the forecast on the test set, the proposed model outperforms the network trained on plain patterns on the test set as well. Analysing the results of the tests, the following evidence emerges:

The assumptions of the model are validated: the assumptions stated in Section 3 proved to be reasonable; that is, in seasonal time series a latent piece of information can be found that improves the forecasting process.

The local measures are appropriate: in the proposed model, the mean, variance and regression slope are adopted. Although these measures seem suitable to improve the forecast, other local measures could also be exploited.

The usage of the Seasonal Period Index improves the performance: providing the information about the period to which the forecast will belong contributes to improving the performance of the ANN.

A simple network architecture works fine: although it is possible to use different typologies of ANN, the tests confirm that even a relatively simple feedforward neural network can show good performance.

The generalization ability of the network is improved: perhaps the most important result, the added information provided by the local measures and the SPI improves the forecasting ability on new data.

The model is robust against trend: this result emerges from the comparison of the results with the application of de-seasonalizing methods: such methods, in fact, perform well only if the time series is stable, while the proposed method works fine also in the presence of trend (as in the example of the time series depicted in Figure 2).

7. CONCLUSION: REMARKS AND FURTHER EXTENSIONS

In this work, a novel approach to the usage of ANNs for forecasting has been proposed. Instead of focusing attention on the characteristics of the ANN, we have proposed a model which aims to better exploit the information embedded in the historical data. The results show that the usage of so-called local measures (like mean, variance and regression slope) can significantly improve the performance of an ANN. Moreover, the introduction of the Seasonal Period Index (SPI) contributes to the effectiveness of the model. In this work, only a part of the possible measures and pattern configurations has been examined. In order to generalize the results, some of the most challenging extensions of the present work could be the following:

the identification of a suitable structure of the ANN (in particular, the number of neurons of the hidden layer) by using some correlation measures of the characteristics of the time series itself, in order to reduce the size of the network and the number of training epochs;

although the feedforward network with the LM algorithm performs well, it could be interesting to extend the analysis to other network architectures, such as the Generalized Regression Neural Networks (GRNN) often used for function approximation;

another interesting topic is the definition of the extension of the window W to consider in the transformation of the time series into input patterns. This parameter could be linked, for example, to the seasonality of the time series or to other characteristics;

finally, the proposed method could be extended to non-seasonal time series, which could represent one of the most attractive fields of future research.

REFERENCES

Ciocoiu, I.B. (1998). Time series analysis using RBF networks with FIR/IIR synapses. Neurocomputing, 20.

Cloarec, G.M., Ringwood, J. (1998). Incorporation of Statistical Methods in Multi-step Neural Network Prediction Models. IEEE.

Conway, A.J., Macpherson, K.P., Brown, J.C. (1998). Delayed time series predictions with neural networks. Neurocomputing, 18.
Crone, S.F. (2002). Training artificial neural networks for time series prediction using asymmetric cost functions. Technical Working Paper IWI-020, accepted at ICONIP 2002 Proceedings.

Fama, E.F. (1970). Efficient capital markets: A review of theory and empirical work. Journal of Finance, 25.

Hegazy, Y.G., Salama, M.M.A. (1995). An Efficient Algorithm for Identifying the Structure of Artificial Neural Networks for Forecasting Problems. IEEE.

Nottingham, Q.J., Cook, D.F. (2001). Local linear regression for estimating time series data. Computational Statistics & Data Analysis, 37.

Sitte, J., Sitte, R. (2002). Neural Networks Approach to the Random Walk Dilemma of Financial Time Series. Applied Intelligence, 16.

Thiesing, F.M., Vornberger, O. (1997). Sales Forecasting Using Neural Networks. Proceedings ICNN'97, Houston, Texas, 9-12 June 1997, vol. 4, IEEE.

Zemouri, R., Racoceanu, D., Zerhouni, N. (2003). Recurrent radial basis function network for time-series prediction. Engineering Applications of Artificial Intelligence, 16.

au/~hyndman/tsdl/
