Forecasting Time Series with Multiple Seasonal Cycles using Neural Networks with Local Learning

Grzegorz Dudek

Department of Electrical Engineering, Czestochowa University of Technology, Al. Armii Krajowej 17, 42-200 Czestochowa, Poland

Abstract. In the article a simple neural model with local learning for forecasting time series with multiple seasonal cycles is presented. This model uses patterns of the time series seasonal cycles: input ones representing cycles preceding the forecast moment and forecast ones representing the forecasted cycles. Patterns simplify the forecasting problem, especially when a time series exhibits nonstationarity, heteroscedasticity, trend and many seasonal cycles. The artificial neural network learns using the training sample selected from the neighborhood of the query pattern. As a result the target function is approximated locally, which leads to a reduction in problem complexity and enables the use of simpler models. The effectiveness of the proposed approach is illustrated through applications to electrical load forecasting and compared with ARIMA and exponential smoothing approaches. In day ahead load forecasting the simulations indicate the best results for the one-neuron network.

Keywords: seasonal time series forecasting, short-term load forecasting, local learning, neural networks.

1 Introduction

Time series may contain four different components: trend, seasonal variations, cyclical variations, and an irregular component. Seasonality is defined as the tendency of time series data to exhibit some pattern that repeats periodically with variation. Sometimes a time series contains multiple seasonal cycles of different lengths. Fig. 1 shows such a time series, where we can observe annual, weekly and daily variations. This series represents the hourly electrical load of the Polish power system. From this figure it can be seen that the daily and weekly profiles change during the year. In summer they are more flat than in winter. The daily profile depends on the day of the week as well. The profiles of the weekdays are similar to each other in the same period of the year. Nonstationarity and heteroscedasticity should be counted among the characteristic features of this time series as well. All these features have to be captured by a flexible forecasting model.

The most commonly employed methods of modeling seasonal time series include [1]: the seasonal autoregressive integrated moving average (ARIMA) model, exponential smoothing (ES), artificial neural networks (ANNs), dynamic harmonic regression, vector autoregression, random effect models, and many others.

Fig. 1. The load time series of the Polish power system in three-year (a) and one-week (b) intervals.

The base ARIMA model with just one seasonal pattern can be extended for the case of multiple seasonalities. An example of such an extension was presented in [2]. A combinatorial problem of selecting appropriate model orders is an inconvenience in time series modeling using multiple seasonal ARIMA. Another disadvantage is the linear character of the ARIMA model.

Another popular model, the Holt-Winters exponential smoothing, was adapted by Taylor so that it can accommodate two and more seasonalities [2]. An advantage of the ES models is that they can be nonlinear. On the other hand such a model can be viewed as being of high dimension, as it involves initialization and updating of a large number of terms (level, periods of the intraday and intraweek cycles). In [1] a more parsimonious formulation of ES is proposed. New exponentially weighted methods for forecasting time series that consist of both intraweek and intraday seasonal cycles can be found in [3]. Gould et al. [4] introduced the innovation state space models that underlie ES methods for both additive and multiplicative seasonality. This procedure provides a theoretical foundation for ES methods and improves on the current approaches by providing a common sense structure to the models, flexibility in modeling seasonal patterns, a potential reduction in the number of parameters to be estimated, and model based prediction intervals.

ANNs, being nonlinear and data-driven in nature, may be well suited to seasonal time series modeling. They can extract unknown and general information from multi-dimensional data using their self-learning ability. This feature releases a designer from the difficult task of a priori model selection. But new problems appear: the selection of the network architecture as well as of the learning algorithm. Of the many types of ANN, the multilayer perceptron, which has the property of universal approximation, is used most often in forecasting tasks. ANNs are able to deal with seasonal time series without prior seasonal adjustment, but deseasonalization and also detrending are recommended [5].

Time series decomposition is used not only with ANNs, but also with other models, e.g. ARIMA and ES. The components, showing less complexity than the original time series, can be modeled independently and more accurately. Usually the time series is decomposed into seasonal, trend and stochastic components. Other methods of decomposition apply the Fourier or wavelet transform. A simple way to remove seasonality is to define a separate time series for each observation in a cycle, i.e. in the case of a cycle of length n, n time series are defined, each including observations in the same position in successive cycles.

This paper considers a simple neural forecasting model that approximates the target function using patterns of seasonal cycles. Defining patterns, we do not need to decompose a time series. A trend and many seasonal cycles, as well as nonstationarity and heteroscedasticity, are not a problem here when proper pattern definitions are used. The proposed neural model learns in a local learning procedure, which allows modeling the target function in the neighborhood of the query pattern. As a result we get a local model which is better fitted in this neighborhood.

2 Patterns of the Time Series Seasonal Cycles

Our goal is to forecast the time series elements in a period of one seasonal cycle of the shortest length. In the case of the time series shown in Fig. 1 this is a daily cycle containing n = 24 elements (hourly loads). The time series is divided into sequences containing one seasonal cycle of length n. In order to eliminate the trend and the seasonal variations of periods longer than n (weekly and annual variations in our example), the sequence elements are preprocessed to obtain their patterns. A pattern is a vector with components that are functions of the actual time series elements. The input and output (forecast) patterns are defined as x_i = [x_{i,1}, x_{i,2}, ..., x_{i,n}]^T and y_i = [y_{i,1}, y_{i,2}, ..., y_{i,n}]^T, respectively. The patterns are paired (x_i, y_i), where y_i is a pattern of the time series sequence succeeding the sequence represented by x_i. The interval between these sequences is equal to the forecast horizon τ.

The way the x and y patterns are defined depends on the time series nature (seasonal variations, trend), the forecast period and the forecast horizon. Functions transforming series elements into patterns should be defined so that patterns carry most information about the process. Moreover, functions transforming forecast sequences into patterns y_i should ensure the opposite transformation: from the forecasted pattern y_i to the forecasted time series sequence.

The forecast pattern y_i = [y_{i,1}, y_{i,2}, ..., y_{i,n}] encodes the successive actual time series elements z in the forecast period i+τ: z_{i+τ} = [z_{i+τ,1}, z_{i+τ,2}, ..., z_{i+τ,n}], and the corresponding input pattern x_i = [x_{i,1}, x_{i,2}, ..., x_{i,n}] maps the time series elements in the period i preceding the forecast period: z_i = [z_{i,1}, z_{i,2}, ..., z_{i,n}]. The vectors y_i are encoded using current process parameters from the nearest past, which allows taking into consideration the current variability of the process and ensures the possibility of decoding. Some definitions of the functions mapping the original space Z into the pattern spaces X and Y, i.e. f_x: Z → X and f_y: Z → Y, are presented in [6]. The most popular definitions are of the form:

$$
f_x(z_{i,t}) = \frac{z_{i,t} - \bar{z}_i}{\sqrt{\sum_{l=1}^{n} (z_{i,l} - \bar{z}_i)^2}}, \qquad
f_y(z_{i,t}) = \frac{z_{i+\tau,t} - \bar{z}_i}{\sqrt{\sum_{l=1}^{n} (z_{i,l} - \bar{z}_i)^2}},
\qquad (1)
$$

where: i = 1, 2, ..., N is the period number, t = 1, 2, ..., n is the time series element number in the period i, τ is the forecast horizon, z_{i,t} is the t-th time series element in the period i, and z̄_i is the mean value of the elements in period i.

The function f_x defined using (1) expresses normalization of the vectors z_i. After normalization these vectors have unit length, zero mean and the same variance. When we use the standard deviation of the vector z_i components in the denominator of equation (1), we receive a vector x_i with unit variance and zero mean. Note that a nonstationary and heteroscedastic time series is represented by patterns having the same mean and variance. The forecast pattern y_i is defined using functions analogous to the input pattern function f_x, but it is encoded using the time series characteristic (z̄_i) determined from the process history, which enables decoding of the forecasted vector z_{i+τ} after the forecast of pattern y_i is determined. To calculate the forecasted time series element values on the basis of their patterns we use the inverse function f_y^{-1}(y_{i,t}).
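The transformations in (1) and their inverse are simple to implement. The sketch below is a minimal NumPy illustration, not code from the paper; the helper names `encode_patterns` and `decode_forecast` and the toy data are assumptions. It builds the paired patterns from a series arranged as consecutive daily cycles and checks that decoding a y-pattern with the mean and normalizing factor of its input period restores the actual loads.

```python
import numpy as np

def encode_patterns(Z, tau=1):
    """Build paired patterns (x_i, y_i) from a series arranged as consecutive
    seasonal cycles (each row of Z is one cycle of length n), following (1)."""
    means = Z.mean(axis=1, keepdims=True)                    # z-bar_i, mean of each period
    norms = np.sqrt(((Z - means) ** 2).sum(axis=1, keepdims=True))
    X = (Z - means) / norms                                  # x_i = f_x(z_i)
    Y = (Z[tau:] - means[:-tau]) / norms[:-tau]              # y_i encodes cycle i+tau with stats of cycle i
    return X[:-tau], Y, means[:-tau], norms[:-tau]

def decode_forecast(y_hat, mean_i, norm_i):
    """Inverse of f_y: recover the forecasted loads z_{i+tau} from a y-pattern
    using the mean and normalizing factor of the input period i."""
    return y_hat * norm_i + mean_i

# toy example: 50 daily cycles of n = 24 hourly loads
rng = np.random.default_rng(0)
Z = 15 + 3 * np.sin(np.linspace(0, 2 * np.pi, 24)) + rng.normal(0, 0.3, (50, 24))
X, Y, means, norms = encode_patterns(Z, tau=1)
print(np.allclose(decode_forecast(Y[0], means[0], norms[0]), Z[1]))   # True
```

Because both f_x and f_y use only the statistics of period i, the decoding step needs nothing beyond what is already known at forecast time.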

3 Local Learning

The training data can have different properties in different regions of the input and output spaces, thus it is reasonable to model these data locally. Local learning [7] concerns the optimization of the learning system on a subset of the training sample which contains points from the neighborhood around the current query point x*. By the neighborhood of x*, in the simplest case, we mean the set of its k nearest neighbors. A result of local learning is that the model accurately adjusts to the target function in the neighborhood of x* but shows weaker fitting outside this neighborhood. Thus we get a model which is locally competent but whose global generalization property is weak. Modeling the target function in different regions of the space requires relearning of the model or even constructing a different model, e.g. we can use a linear model for linear fragments of the target function while for the nonlinear fragments we can use a nonlinear model. The generalization can be achieved by using a set of local models that are competent for different regions of the input space. Usually these models are learned when a new query point is presented. The error criterion minimized in the local learning algorithm can be defined as follows:

$$
E(x^*) = \sum_{i=1}^{N} K\big(d(x_i, x^*), h\big)\, \delta\big(y_i, f(x_i)\big),
\qquad (2)
$$

where: N is the number of training patterns, K(d(x_i, x*), h) is a kernel function with bandwidth h, d(x_i, x*) is the distance between the query pattern x* and the training pattern x_i, and δ(y_i, f(x_i)) is the error between the model response f(x_i) and the target response y_i when the input pattern x_i is presented (this response can be a scalar value).

Various kernel functions might be used; uniform kernels and Gaussian kernels are among the most popular. The kernel is centered on the query point x* and the bandwidth h determines the weight of the i-th training pattern error in (2). When we use the uniform kernel, the training patterns for which d(x_i, x*) ≤ h = d(x_k, x*), where x_k is the k-th nearest neighbor of x*, have unit weights. More distant patterns have zero weights, and therefore there is no need to use these points in the learning process. For Gaussian kernels all training points have nonzero weights calculated from the formula exp(−d²(x_i, x*)/(2h²)), which means that their weights decrease monotonically with the distance from x* and with a speed dependent on h. In order to reduce the computational cost of determining the errors and weights for all training points, we can combine both kernels and calculate the weights according to the Gaussian kernel for only the k nearest neighbors of x*. The computational cost then depends not on the total number of training patterns but only on the number of considered neighbors k. In the experimental part of this paper we use the local learning procedure with the uniform kernel.
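As an illustration of criterion (2), the sketch below (a NumPy toy, with illustrative function and variable names that are not from the paper) selects the k nearest neighbors of a query pattern, assigns uniform or Gaussian kernel weights with the bandwidth set to the distance of the k-th neighbor, and evaluates the weighted squared error of a placeholder model on the selected neighborhood only.

```python
import numpy as np

def local_weights(X_train, x_query, k=12, kernel="uniform"):
    """Select the k nearest neighbors of the query pattern and return their
    indices and the kernel weights used in criterion (2); the bandwidth h is
    the distance to the k-th nearest neighbor."""
    d = np.linalg.norm(X_train - x_query, axis=1)        # Euclidean distances d(x_i, x*)
    idx = np.argsort(d)[:k]
    h = d[idx].max()
    if kernel == "uniform":
        w = np.ones(k)                                   # unit weights inside the neighborhood
    else:
        w = np.exp(-d[idx] ** 2 / (2 * h ** 2))          # Gaussian weights restricted to the k neighbors
    return idx, w

# evaluating the weighted error E(x*) of a placeholder model on the neighborhood only
rng = np.random.default_rng(1)
X_train = rng.normal(size=(200, 24))
y_train = rng.normal(size=200)
x_query = rng.normal(size=24)
idx, w = local_weights(X_train, x_query, k=12, kernel="gaussian")
f = lambda X: X.mean(axis=1)                             # placeholder model response f(x_i)
E = np.sum(w * (y_train[idx] - f(X_train[idx])) ** 2)    # squared-error form of delta(., .)
print(idx, round(float(E), 3))
```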

4 Experimental Results

As an illustrative example of forecasting time series with multiple seasonal cycles using neural networks with local learning we study the short-term electrical load forecasting problem. Short-term load forecasting plays a key role in the control and scheduling of power systems and is extremely important for energy suppliers, system operators, financial institutions, and other participants in electric energy generation, transmission, distribution, and markets.

In the first experiments we use the time series of the hourly electrical load of the Polish power system from the period 2002-2004. This series is shown in Fig. 1. The time series was divided into training and test parts. The test set contained 31 pairs of patterns from July 2004. The training part Ψ contained patterns from the period from January 2002 to the day preceding the day of forecast.

We define the forecasting tasks as forecasting the power system load at hour t = 1, 2, ..., 24 of the day j = 1, 2, ..., 31, where j is the day number in the test set. So we get 744 forecasting tasks. In the local learning approach a separate ANN was created and learned for each task. The training set for each forecasting task is prepared as follows: first we prepare the set Ω = {(x_i, y_{i,t})}, where i indicates pairs of patterns from Ψ representing days of the same type (Monday, ..., Sunday) as the days represented by the query pair (x*, y*_t); then, based on the Euclidean distances d(x_i, x*), we select from Ω the k nearest neighbors of the query pair, getting the training set Φ = {(x_i, y_{i,t})} ⊂ Ω ⊂ Ψ. For example, when the forecasting task is to forecast the system load at hour t on Sunday, the model learns on the k nearest neighbors of the query pattern, which are selected from the x-patterns representing the Saturday patterns and the t-th components of the y-patterns representing the Sunday patterns.

The ANN (the multilayer perceptron) learns the mapping of the input patterns to the t-th components of the output patterns: f: X → Y_t. The number of ANN inputs is equal to the number of x-pattern components. To prevent overfitting, the ANN is learned using the Levenberg-Marquardt algorithm with Bayesian regularization [7], which minimizes a combination of squared errors and net weights. The resulting network has good generalization qualities.

In the first experiment we assume k = 12. Since the target function f is modeled locally, using a small number of learning points, a rather simple form of this function should be expected, which implies a small number of neurons. We tested networks composed of only one neuron with a linear or bipolar sigmoidal activation function, and networks with one hidden layer consisting of m = 2, ..., 8 neurons with sigmoidal activation functions and one output neuron with a linear activation function. Such a network architecture can be seen as a universal approximator. APE and MAPE (absolute percentage error and mean APE) are adopted here to assess the performance of the forecasting models. The results (MAPE for the training and test samples and the interquartile range (IQR) of MAPE_tst) of the 9 variants of ANNs are presented in Tab. 1. The test errors for these variants are statistically indistinguishable (to check this we use the Wilcoxon rank sum test for equality of APE medians; α = 0.05). It is observed that for the two-layered networks in many cases most weights tend to zero (weight decay is a result of the regularization), thus some neurons can be eliminated. As the optimal ANN architecture, the one with one neuron with a sigmoidal activation function is chosen. This one-neuron ANN is used in the next experiments.

In the second experiment we examine the network performance depending on the number of nearest neighbors k, i.e. the size of the training set Φ. We change k from 2 to 50. The results are shown in Fig. 2, where MAPE for the case when the ANN is trained using all training points representing days of the same type as the days represented by the query pair, i.e. points from the set Ω, is also shown. As we can see from this figure, the test error remains at approximately the same level when k ∈ [6, 50]. For these cases the MAPE_tst values are statistically indistinguishable when using the Wilcoxon test. When we train the ANN using patterns from the set Ω, MAPE_tst is statistically distinguishable and greater than for k ∈ [6, 50].

Table 1. Results of the 9 variants of ANNs (number of neurons: linear, sigmoidal; MAPE_trn; MAPE_tst; IQR_tst).

Fig. 2. MAPE for the training sets (rings) and the test set (crosses) depending on k.
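The per-task training-set construction described above (same day type, then k nearest neighbors) and the one-neuron network can be sketched as follows. The paper trains the neuron with the Levenberg-Marquardt algorithm and Bayesian regularization; the sketch substitutes plain gradient descent with an L2 weight penalty purely to illustrate the model structure (24 inputs, a single bipolar sigmoid output). All names and data here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def task_training_set(X, Y_t, day_type, query_day_type, x_query, k=12):
    """Training set Phi for one forecasting task: keep pairs whose input day has
    the same day type as the query day, then take the k nearest neighbors."""
    mask = day_type == query_day_type
    Xc, yc = X[mask], Y_t[mask]
    idx = np.argsort(np.linalg.norm(Xc - x_query, axis=1))[:k]
    return Xc[idx], yc[idx]

def fit_one_neuron(X, y, lam=1e-2, lr=0.1, epochs=2000):
    """Single neuron with a bipolar sigmoid (tanh) activation fitted by gradient
    descent on squared error with an L2 weight penalty (a simple stand-in for
    Levenberg-Marquardt with Bayesian regularization)."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        out = np.tanh(X @ w + b)
        grad = (out - y) * (1 - out ** 2)        # derivative w.r.t. the pre-activation
        w -= lr * (X.T @ grad / len(y) + lam * w)
        b -= lr * grad.mean()
    return w, b

# toy data: 24-component x-patterns, scalar targets y_{i,t}, day types 0..6
rng = np.random.default_rng(2)
X = rng.normal(scale=0.2, size=(300, 24))
Y_t = np.tanh(0.5 * X @ rng.normal(size=24))
day_type = rng.integers(0, 7, size=300)
x_query = rng.normal(scale=0.2, size=24)
Xk, yk = task_training_set(X, Y_t, day_type, query_day_type=6, x_query=x_query, k=12)
w, b = fit_one_neuron(Xk, yk)
print("forecast of the t-th component of y*:", round(float(np.tanh(x_query @ w + b)), 4))
```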

In the local learning approach the thorny issue is the ratio of the number of training points to the number of free parameters of the network. This ratio in our example, even for the one-neuron ANN, is too small (12/25), which means that the model is oversized (it has too many degrees of freedom in relation to the problem complexity expressed by only a few training points). Regularization, which has the form of a penalty for complexity, is a good way to solve this problem. Another idea is feature selection or feature extraction as a form of dimensionality reduction. The most popular method of feature extraction is principal component analysis (PCA). This procedure uses an orthogonal transformation to convert a set of multidimensional vectors of possibly correlated components into a set of vectors of linearly uncorrelated components called principal components. The number of principal components is less than or equal to the dimension of the original vectors.

In the next experiment we transform the 24-dimensional x-patterns into patterns with a smaller number of uncorrelated components using PCA. Fig. 3 shows the relationship between MAPE and the number of principal components. From this figure it can be seen that the levels of errors are very similar. The MAPE_tst values are statistically indistinguishable for different numbers of principal components. Using only the first principal component we can build a good neural forecasting model for our data. Such a model has only two parameters.
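A minimal PCA reduction of the x-patterns can be obtained from an SVD of the centered pattern matrix. The sketch below is an illustrative assumption (names and random stand-in data are not from the paper); it returns the scores on the leading components together with the fraction of variance each component explains, which is the kind of information reported in Figs. 3 and 4.

```python
import numpy as np

def pca_reduce(X, n_components=1):
    """Project centered x-patterns onto their leading principal components and
    report the fraction of total variance each component explains."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = S ** 2 / np.sum(S ** 2)          # variance ratio of each component
    scores = Xc @ Vt[:n_components].T            # reduced patterns (PCA scores)
    return scores, explained[:n_components]

# reducing 24-dimensional x-patterns to the first principal component; with a
# single input the one-neuron model has only two parameters (one weight, one bias)
rng = np.random.default_rng(3)
X = rng.normal(size=(500, 24)) @ np.diag(np.linspace(2.0, 0.1, 24))
scores, ratio = pca_reduce(X, n_components=1)
print(scores.shape, "variance explained by PC1:", round(float(ratio[0]), 3))
```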

The percent variance explained by the corresponding principal components is shown in Fig. 4. The first principal component explains 30% of the total variance.

Fig. 3. MAPE for the training sets (rings) and the test set (crosses) depending on the number of principal components.

Fig. 4. The percent variance explained by the corresponding principal components.

Now we compare the proposed one-neuron ANN with other popular models for seasonal time series forecasting: ARIMA and ES. These models were tested in the next day electrical load curve forecasting problem on three time series of electrical load:
- PL: time series of the hourly load of the Polish power system from the period 2002-2004 (this time series was used in the experiments described above). The test sample includes data from 2004 with the exception of 3 untypical days (e.g. holidays),
- FR: time series of the half-hourly load of the French power system. The test sample includes data from 2009 except for 2 untypical days,
- GB: time series of the half-hourly load of the British power system. The test sample includes data from 2009 except for 8 untypical days.

For ARIMA the time series were decomposed into n series, i.e. a separate series was created for each t. In this way the daily seasonality was removed. For the independent modeling of these series the ARIMA(p, d, q)×(P, D, Q)_m model was used:

$$
\Phi(B^m)\,\varphi(B)\,(1 - B^m)^D (1 - B)^d z_t = c + \Theta(B^m)\,\theta(B)\,\xi_t,
\qquad (3)
$$

where {z_t} is the time series, {ξ_t} is a white noise process with mean zero and variance σ², B is the backshift operator, Φ(.), φ(.), Θ(.) and θ(.) are polynomials of order P, p, Q and q, respectively, m is the seasonal period (for our data m = 7), d and D are the orders of nonseasonal and seasonal differencing, respectively, and c is a constant.

To find the best ARIMA model for each time series we use a step-wise procedure for traversing the model space, which is implemented in the forecast package for the R system for statistical computing [8]. This automatic procedure returns the model with the lowest Akaike's Information Criterion (AIC) value. The ARIMA model parameters, as well as the parameters of the ES model described below, were estimated using 12-week time series fragments immediately preceding the forecasted daily period. Untypical days in these fragments were replaced with the days from the previous weeks.

The ES state space models [9] are classified into 30 types depending on how the seasonal, trend and error components are taken into account. These components can be expressed additively or multiplicatively, and the trend can be damped or not. For example, the ES model with a damped additive trend, multiplicative seasonality and multiplicative errors is of the form:

$$
\begin{aligned}
\text{Level:} \quad & l_t = (l_{t-1} + \phi b_{t-1})(1 + \alpha \xi_t), \\
\text{Growth:} \quad & b_t = \phi b_{t-1} + \beta (l_{t-1} + \phi b_{t-1})\,\xi_t, \\
\text{Seasonal:} \quad & s_t = s_{t-m}(1 + \gamma \xi_t), \\
\text{Forecast:} \quad & \mu_t = (l_{t-1} + \phi b_{t-1})\,s_{t-m},
\end{aligned}
\qquad (4)
$$

where l_t represents the level of the series at time t, b_t denotes the growth (or slope) at time t, s_t is the seasonal component of the series at time t, μ_t is the expected value of the forecast at time t, α, β, γ ∈ (0, 1) are the smoothing parameters, and φ ∈ (0, 1) denotes a damping parameter.

In model (4) there is only one seasonal component. For this reason, as in the case of the ARIMA model, the time series is decomposed into n series, each of which represents the load at the same time of day. These series were modeled independently using an automated procedure implemented in the forecast package for the R system [8]. In this procedure the initial states of the level, growth and seasonal components are estimated, as well as the smoothing and damping parameters. AIC was used for selecting the best model for a given time series.
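In the experiments the ES models are selected and estimated automatically by the forecast package in R. Purely as an illustration of the recursions in (4), the Python sketch below filters a series with fixed (not estimated) smoothing and damping parameters and returns the one-step-ahead forecasts μ_t; the function name, parameter values and toy data are assumptions for illustration only.

```python
import numpy as np

def ets_damped_mult(z, m, alpha, beta, gamma, phi, l0, b0, s0):
    """Recursions of model (4): damped additive trend, multiplicative seasonality
    and multiplicative errors, for fixed parameters. Returns the one-step-ahead
    forecasts mu_t (no parameter estimation is performed here)."""
    assert len(s0) == m
    l, b, s = l0, b0, list(s0)                 # s holds s_{t-m}, ..., s_{t-1}
    mu = np.empty(len(z))
    for t, zt in enumerate(z):
        lev = l + phi * b                      # l_{t-1} + phi * b_{t-1}
        mu[t] = lev * s[0]                     # forecast: mu_t = lev * s_{t-m}
        xi = zt / mu[t] - 1.0                  # multiplicative error xi_t
        l = lev * (1 + alpha * xi)             # level equation
        b = phi * b + beta * lev * xi          # growth equation
        s = s[1:] + [s[0] * (1 + gamma * xi)]  # seasonal equation, shift the buffer
    return mu

# toy weekly-seasonal series (m = 7), mimicking one of the hour-wise sub-series
rng = np.random.default_rng(4)
season = np.array([1.1, 1.0, 0.95, 0.9, 1.0, 1.05, 1.0])
z = 20 * np.tile(season, 12) * (1 + rng.normal(0, 0.02, 84))
mu = ets_damped_mult(z, m=7, alpha=0.2, beta=0.05, gamma=0.1, phi=0.98,
                     l0=20.0, b0=0.0, s0=season)
print("last one-step-ahead forecast:", round(float(mu[-1]), 2))
```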

In Table 2 the results of the PL, FR and GB time series forecasts are presented. In this table the results of forecasts determined by the naïve method are also shown. The forecast rule in this case is as follows: the forecasted daily cycle is the same as seven days ago. The Wilcoxon test indicates statistically significant differences between the MAPE_tst values for each pair of models and each time series, so we can indicate the one-neuron ANN as the best model for these data and ES as the second best model.

Table 2. Results of forecasting: MAPE_tst and IQR for the ANN, ARIMA, ES and naïve models on the PL, FR and GB series.

The last experiment concerns time series forecasting up to seven daily periods ahead. In such tasks the y-patterns are defined using τ = 1, 2, ..., 7. For each horizon τ the one-neuron ANN is trained using the same local learning scheme as for τ = 1 described above. The forecast errors for the PL, FR and GB time series are presented in Fig. 5. For the FR and GB data the ANN gave the lowest errors. For the PL data and τ > 2 the ES model is better, and for τ > 3 the ARIMA model is also better. The actual and forecasted fragments of the time series are shown in Fig. 6.

Fig. 5. The forecast errors for different horizons.

Note that in the case of ARIMA and ES the model parameters are estimated on the basis of the time series fragment (12 weeks in our example) directly preceding the forecasted fragment. The ANN learns on a training set composed of patterns representing daily periods from a longer history. In the local learning case the training patterns are selected using a criterion based on the similarity to the current input pattern.
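The naïve benchmark and the error measures used in the comparisons can be reproduced in a few lines. The sketch below uses illustrative names and toy data (assumptions, not the paper's data): it forecasts each daily cycle with the cycle observed seven days earlier and summarizes the absolute percentage errors by their mean (MAPE) and interquartile range.

```python
import numpy as np

def naive_forecast(Z):
    """Naive rule: the forecast of daily cycle i is the cycle observed seven days
    earlier (rows of Z are consecutive daily cycles)."""
    return Z[:-7], Z[7:]                       # (forecasts, matching actual cycles)

def mape_and_iqr(actual, forecast):
    """MAPE and the interquartile range of the absolute percentage errors
    (one simple reading of the IQR reported in the comparisons)."""
    ape = 100.0 * np.abs(actual - forecast) / np.abs(actual)
    q25, q75 = np.percentile(ape, [25, 75])
    return ape.mean(), q75 - q25

# toy series of 60 daily cycles with a weekly profile
rng = np.random.default_rng(5)
base = 15 + 3 * np.sin(np.linspace(0, 2 * np.pi, 24))
week = np.array([1.0, 1.0, 1.0, 1.0, 1.0, 0.9, 0.85])
Z = np.array([base * week[d % 7] * (1 + rng.normal(0, 0.01, 24)) for d in range(60)])
forecasts, actual = naive_forecast(Z)
mape, iqr = mape_and_iqr(actual, forecasts)
print(f"MAPE = {mape:.2f}%, IQR of APE = {iqr:.2f}%")
```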

5 Conclusions

In this article we examine a simple neural model with local learning for forecasting seasonal time series. At the initial stage of the forecasting procedure the data are preprocessed to get patterns of the time series seasonal periods. An approach based on the patterns of the seasonal cycles simplifies the problem of forecasting nonstationary and heteroscedastic time series with a trend and many seasonal variations. After this simplification the problem can be modeled using simpler tools. The existence of many seasonal cycles is not a problem when we use a forecasting model based on patterns. We resign from global modeling, which does not necessarily bring good results for the current query point. Instead, we approximate the target function locally in the neighborhood of the query point. The disadvantage of local learning is the need to learn the model for each query point. Since the local complexity is lower than the global one, we can use a simple model that is quickly learned.

Fig. 6. The fragments of the load time series (PL, FR and GB) and their forecasts for different horizons (τ = 1, ..., 7).

This approach is acceptable when we have enough time (some seconds) to learn the model and prepare the forecast. The learning speed is penalized by the selection of the nearest neighbors. As shown by the simulation studies, the one-neuron model is sufficient to model the local relationship between the input and output patterns. This model turned out to be better than the conventional models (ARIMA and exponential smoothing) in one-day-ahead forecasting of the electrical load time series and competitive in forecasting over longer time horizons.

Acknowledgments. The author would like to thank Professor James W. Taylor from the Saïd Business School, University of Oxford, for providing the French and British load data. The study was supported by the Research Project N N financed by the Polish Ministry of Science and Higher Education.

References

1. Taylor, J.W., Snyder, R.D.: Forecasting Intraday Time Series with Multiple Seasonal Cycles Using Parsimonious Seasonal Exponential Smoothing. Department of Econometrics and Business Statistics Working Paper 09/09, Monash University (2009)
2. Taylor, J.W.: Short-Term Electricity Demand Forecasting Using Double Seasonal Exponential Smoothing. Journal of the Operational Research Society 54, 799-805 (2003)
3. Taylor, J.W.: Exponentially Weighted Methods for Forecasting Intraday Time Series with Multiple Seasonal Cycles. International Journal of Forecasting 26(4), 627-646 (2010)
4. Gould, P.G., Koehler, A.B., Ord, J.K., Snyder, R.D., Hyndman, R.J., Vahid-Araghi, F.: Forecasting Time Series with Multiple Seasonal Patterns. European Journal of Operational Research 191 (2008)
5. Zhang, G.P., Qi, M.: Neural Network Forecasting for Seasonal and Trend Time Series. European Journal of Operational Research 160, 501-514 (2005)
6. Dudek, G.: Similarity-based Approaches to Short-Term Load Forecasting. In: Zhu, J.J., Fung, G.P.C. (eds.): Forecasting Models: Methods and Applications. iConcept Press (2010)
7. Foresee, F.D., Hagan, M.T.: Gauss-Newton Approximation to Bayesian Regularization. In: Proc. 1997 International Joint Conference on Neural Networks (1997)
8. Hyndman, R.J., Khandakar, Y.: Automatic Time Series Forecasting: The Forecast Package for R. Journal of Statistical Software 27(3), 1-22 (2008)
9. Hyndman, R.J., Koehler, A.B., Ord, J.K., Snyder, R.D.: Forecasting with Exponential Smoothing: The State Space Approach. Springer (2008)
