The Role of Random Spikes and Concurrent Input Layers in Spiking Neural Networks

Janis Zuters

Faculty of Computing, University of Latvia, Raina bulvaris 19, LV-1586 Riga, Latvia
janis.zuters@lu.lv

Abstract. In spiking neural networks (SNNs), unlike traditional artificial neural networks, signals are propagated by a pulse code instead of a rate code. This results in incorporating the time dimension into the network and thus theoretically ensures a higher computational power. The different principle of operation makes learning in SNNs complicated. In this paper, a Hebbian-learning-based learning algorithm, introduced by the author earlier, is further developed through applying random spikes in order to guarantee frequent firings of neurons for each layer. In addition, the impact of concurrent input layers on the learning ability is shown. The introduced ideas have been illustrated with experimental results.

Keywords: Spiking Neural Networks, Unsupervised Learning, Randomness.

1 Introduction

A neural network is a computational model which is made up of relatively simple interconnected processing units which are put into operation through a learning process. Neural networks are useful tools for solving many types of problems. These problems can be characterised as mapping (including pattern association and pattern classification), clustering, and constraint optimisation. There are several neural network models available for each type of problem [3]. Typically, the neurons of a network communicate via a rate code, i.e., by transmitting pure continuous values irrespective of the timings of the signals. Work with traditional models is usually arranged in steps: during each step, exactly one pattern is classified or recognized. Considering this, any temporal information, if present in the problem, or interconnectedness of patterns is out of the area of responsibility of a neural system, so it should be processed by a superior system. When spiking neural networks appeared, the neural network models became ready to be classified into three generations. The first generation of artificial neural networks consisted of McCulloch-Pitts threshold neurons [9], where a neuron sends a binary signal if the sum of its weighted incoming signals rises above a threshold value.
The second generation neurons use continuous activation functions instead of threshold functions to compute the output signals.

In contrast to classical models, spiking neural networks, or the third generation neural networks, are designed to make use of the temporal dynamics of neurons. Here, communication between neurons involves a pulse code: the exact timing of the transmitted signal (i.e., the spike) is of key importance, but the strength of the signal is not considered. SNNs are believed to be very powerful, but their operation is more complicated and their application is not as obvious as is the case with ordinary neural networks. Nevertheless, SNNs have been the focus of much research. In [12], the method of adaptive layer activation was introduced for Hebbian-like learning in SNNs. The current paper further develops this approach (a) by adding injections of random spikes to induce the learning process, and (b) by extending the network structure with concurrent input layers of neurons in order to enrich the presence of temporal templates in the input structure represented by axonal delays.

2 Spiking Neural Networks

A spiking neural network is a model of artificial neural networks in which signals are propagated via spikes instead of rates (i.e., values). SNNs are potentially very powerful, yet still complicated to operate.

2.1 General Description

Each neuron i in an SNN produces a train of spikes (i.e., fires), and the operation of the neuron is fully characterized by the set of firing times [4] (Fig. 1):

    F_i = { t_i^(1), ..., t_i^(n) },    (1)

where t_i^(n) is the most recent spike of neuron i. One of the most frequently applied spiking neuron models is the integrate-and-fire model [1][2]. In this model, input signals (both actual and recent) from a neuron are combined together in order to achieve enough activation for the neuron to fire. Neuron i is said to fire if its activation state u_i reaches the threshold ϑ. At the moment when the threshold is crossed, a spike with the firing time t_i^(f) is generated. Thus Eq. (2) can be clarified with Eq. (3) [4]:

    F_i = { t_i^(f); 1 ≤ f ≤ n } = { t | u_i(t) = ϑ },    (2)

where the activation state u_i is given by the linear superposition of all contributions:

    u_i(t) = η(t − t̂_i) + Σ_{j ∈ Γ_i} Σ_{t_j^(f) ∈ F_j} w_ij ε(t − t_j^(f)),    (3)

where Γ_i is the set of presynaptic neurons of neuron i; t̂_i is the last spike of neuron i; the function η takes care of refractoriness after a spike is emitted; the kernels ε model the neuron's response to presynaptic spikes.

    η(s) = −ϑ exp(−s / τ),  if s > 0;   η(s) = 0,  if s ≤ 0,    (4)

where τ is a time constant.

    ε(s) = exp(−(s − Δax) / τ_m) − exp(−(s − Δax) / τ_s),  if s − Δax > 0;   ε(s) = 0,  if s − Δax ≤ 0,    (5)

where τ_s and τ_m are time constants, and Δax is the axonal transmission delay of the presynaptic neuron. The axonal transmission delay (or simply, axonal delay) is an essential part of a spiking neuron (Fig. 1). It delays the transmitted signals for the subsequent neurons, and therefore realizes a kind of short-time memory. Axonal delays are usually initialized at random when an SNN is created. Eqs. (4) and (5) have been adapted from [4], [8]. In Fig. 2, the nature of the activation of spiking neurons is visualized. The algorithm in Fig. 3 gives a general overview of the operation of a spiking neuron.

Fig. 1. A single spiking neuron. Incoming spikes Φ_in pass through the synaptic weights w (small filled circles) into the neuron; the neuron produces the spike train Φ_out, delayed by the axonal delay Δax. (Adapted from [7])
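For illustration, the activation of Eqs. (3)-(5) can be sketched in Python (a minimal sketch; the function names `eta`, `epsilon`, and `activation` and all parameter values are illustrative assumptions, not the paper's software):

```python
import math

def eta(s, theta=0.3, tau=2.0):
    # Eq. (4): refractory kernel, a negative after-potential
    # following the neuron's own last spike.
    return -theta * math.exp(-s / tau) if s > 0 else 0.0

def epsilon(s, d_ax=1.0, tau_m=4.0, tau_s=2.0):
    # Eq. (5): postsynaptic response kernel, shifted by the
    # axonal delay d_ax of the presynaptic neuron.
    if s - d_ax <= 0:
        return 0.0
    return math.exp(-(s - d_ax) / tau_m) - math.exp(-(s - d_ax) / tau_s)

def activation(t, last_spike, presyn, weights, delays):
    # Eq. (3): linear superposition of the refractory term and the
    # weighted responses to all presynaptic spike times.
    u = eta(t - last_spike)
    for j, spike_times in presyn.items():
        for t_f in spike_times:
            u += weights[j] * epsilon(t - t_f, d_ax=delays[j])
    return u
```

Here `presyn` maps each presynaptic neuron j to its firing times F_j, `weights[j]` stands for w_ij, and `delays[j]` for the axonal delay Δax of neuron j.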

Fig. 2. Graphic interpretation of the activation state u_i (Eq. (3)). (a) Activation u_i(t) is a superposition of the responses to incoming spikes and the weights w_ij. The kernels ε describe the response of u_i to presynaptic spikes. (b) Activation u_i(t) reaches the threshold ϑ at time t^(f). The reset of the activation is performed immediately in order to avoid continued spiking over a certain period. This refractoriness is achieved by adding a kernel η(t − t^(f)).

The pure integrate-and-fire model is very simple, but the firing sequences appear oversimplified and unable to reproduce the rich and irregular nature of the spiking of biological systems. In order to avoid regularity and determinism, numerous extensions of the model, as well as intermediate ones with other models, have been proposed [6][10].

2.2 Learning in Spiking Neural Networks

While supervised learning methods greatly predominate in non-spiking neural networks, the situation with SNNs is quite different. Hebbian learning [5] is a typical learning method for SNNs. The simplest way to describe Hebbian learning is that the weight modifications depend on how well the pre- and postsynaptic activities match (Eq. (6)). The operation of traditional neural networks is carried out through the propagation of values, and the process of propagation occurs without any constraints: the neural activities influence the value strengths, while the propagation occurs regardless of these values.

    Δw_ij = c y_i y_j.    (6)
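As a minimal numeric sketch of Eq. (6) (the function name and the value of c are illustrative assumptions):

```python
def hebb_delta(y_i, y_j, c=0.1):
    # Eq. (6): the weight change is proportional to the product of
    # postsynaptic (y_i) and presynaptic (y_j) activities.
    return c * y_i * y_j

# Matching activities strengthen the connection...
dw_match = hebb_delta(1.0, 1.0)   # 0.1
# ...while an inactive presynaptic neuron leaves it unchanged.
dw_none = hebb_delta(1.0, 0.0)    # 0.0
```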

With SNNs, it is spikes that are propagated over the network. Unlike traditional neural networks, there are conditions required for the process to occur, since spikes are generated only under certain conditions. This is a noticeable practical problem.

Algorithm run_spiking_neuron (t)
    t -- time step of operation
    ϑ -- threshold to be reached by the activation of the neuron
    Δax -- axonal delay of the neuron
Begin
    Compute activation u by processing the timings of incoming spikes and the weights (Eq. (3))
    If u ≥ ϑ Then  % activation reaches threshold
        Generate a spike at time step t
        Delay the spike by Δax to become an incoming spike for subsequent neurons (see Fig. 1)

Fig. 3. Operation of a spiking neuron at a fixed time step.

2.3 The Learning Method Using Adaptive Layer Activation

In [12], a Hebbian rule-based method for learning in SNNs is proposed. It is built considering the following predefined principles (not mandatory for SNNs in general):

1) Weight changes are performed only after the neuron has just fired. Thus an additional mechanism is required to ensure that spikes are generated;
2) Weight changes are performed only upwards (as a reward for a correct timing of a spike, Eq. (7)). Thus an additional mechanism is required to regulate the weight values to avoid them all becoming too big.

    Δw_ij = α (w_max − w_ij) Σ_{t_j^(f) ∈ F_j} ε(t − t_j^(f)),    (7)

where α is the learning rate; w_max is a constant representing the maximum allowed weight value, and ε is the neuron's response to presynaptic spikes.

Algorithm run_spiking_neuron_adaptive (t, β)
    t -- time step of operation
    β -- activation acceleration factor of the layer
    ϑ -- threshold to be reached by the activation of the neuron
    Δax -- axonal delay of the neuron
Begin
    Compute activation u by processing the timings of incoming spikes and the weights (Eq. (3))
    If β·u ≥ ϑ Then  % (accelerated) activation reaches threshold
        Generate a spike at time step t
        Delay the spike by Δax to become an incoming spike for subsequent neurons

Fig. 4. Operation of a spiking neuron with an adaptive acceleration factor. The activation acceleration factor β determines the sensitivity of the neuron.

These conditions were obeyed by introducing the following key innovations (see the algorithms in Figs. 4 and 5):

1) Adjustment of the average activation state of the neurons in each layer in order to ensure a determined average rhythm of spikes within a layer;
2) Maintenance of a fixed average volume of synaptic weight in each neuron in order to avoid a situation in which all the weights converge to the maximum possible value.

Algorithm learning_adaptive_layer_activation (t)
    t -- time step of operation
    β -- activation acceleration factor of the layer (set to 1.0 at the beginning)
    ϑ -- threshold to be reached by the activation of the neuron
    run_spiking_neuron_adaptive -- algorithm to operate one neuron with the adaptive acceleration factor
Begin
    Forall neurons in the layer
        Call run_spiking_neuron_adaptive (t, β)  % run neuron (see the algorithm in Fig. 4)
        If the neuron has just fired Then
            Forall weights in the neuron
                Increase the weight proportionally to the respective incoming activation (Eq. (7))
            Decrease all the weights uniformly to keep a fixed average weight value of the neuron
        Update the layer statistics about neuron activity in the layer (frequency of spikes)
    If neuron activity is strong (according to the statistics) Then
        Decrease β
    Else
        Increase β

Fig. 5. Learning with the method of adaptive layer activation for one layer. In addition to the neuron weights, the activation acceleration factor β also undergoes learning.

3 Random Spikes and Concurrent Input Layers

Although spiking neural networks are believed to be very powerful, they are complicated to operate, and training them is still a challenging issue. In this section, two different ideas are proposed to operate a kind of SNNs (built according to the principles stated in the previous section):

1) Injecting random spikes to induce the learning of an SNN;
2) Extending the network structure with concurrent input layers.

The proposed methods help to realize a way in which an SNN is set to operate. The experimental results show that these methods help to successfully overcome various problems in building SNNs.

3.1 The Learning Method Using Random Spikes

Applying effects of randomness is a method already used to extend integrate-and-fire neural networks [11]. This section introduces the approach of using random spikes in learning with SNNs.
The current research is a continuation of the previous work, where the following principle in the operation of an SNN was assumed: learning within a neuron is performed only after the neuron has just fired, i.e., has just generated a spike. Since normally firing is accomplished only when the activation of the neuron reaches a certain threshold value, the problem to overcome is how to provide a neuron with the conditions to fire frequently in order to undergo learning. In the previous research, it was proposed to solve this problem by using the activation acceleration factor. Its main idea is that the requirements for firing are lowered if the neural activity in the layer is insufficient. Such a situation is typical at the very beginning of the learning process. In the new approach, this principle is substituted by generating random spikes with a certain probability. This principle ensures a minimum activity of a neuron and, by this, also enables the learning process to happen. Roughly speaking, the new method has an effect similar to that of the method of adaptive layer activation, yet it is simpler and thus much easier to implement. The algorithm in Fig. 6 unveils the principle of random spikes used in the operation of a neuron. The algorithm in Fig. 7 shows the learning algorithm for a whole layer using this principle. The adjustment of the neuron weights remains the same as in the previous approach.

Algorithm run_spiking_neuron_random_spikes (t, ρ)
    t -- time step of operation
    ρ -- random spike probability (0 ≤ ρ << 1)
    ϑ -- threshold to be reached by the activation of the neuron
    Δax -- axonal delay of the neuron
    get_random_value -- generates a random value from the interval 0..1
Begin
    Compute activation u by processing the timings of incoming spikes and the weights (Eq. (3))
    fire ← False
    If u ≥ ϑ Then  % activation reaches threshold
        fire ← True
    Elseif get_random_value < ρ Then  % probability to generate a random spike
        fire ← True
    If fire Then
        Generate a spike at time step t
        Delay the spike by Δax to become an incoming spike for subsequent neurons

Fig. 6. Operation of a spiking neuron with random spikes. Spikes are also generated at random to guarantee a certain firing frequency.

Algorithm learning_random_spikes (t, ρ)
    t -- time step of operation
    ρ -- random spike probability (0 ≤ ρ << 1)
    ϑ -- threshold to be reached by the activation of the neuron
    run_spiking_neuron_random_spikes -- algorithm to operate one neuron with random spikes
Begin
    Forall neurons in the layer
        Call run_spiking_neuron_random_spikes (t, ρ)  % run neuron (see the algorithm in Fig. 6)
        If the neuron has just fired Then
            Forall weights in the neuron
                Increase the weight proportionally to the respective incoming activation (Eq. (7))
            Decrease all the weights uniformly to keep a fixed average weight value of the neuron

Fig. 7. Learning with the method of random spikes for one layer.
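The mechanism of Figs. 6 and 7 can be sketched as follows (a simplified reinterpretation of the pseudocode, not the original software; the Eq. (7)-style upward update and the uniform renormalization follow the layer learning step described above, and all names and parameter values are assumptions):

```python
import random

def run_spiking_neuron_random_spikes(u, theta=0.3, rho=0.04, rng=random.random):
    # Fig. 6: fire if the activation reaches the threshold, or, with a
    # small probability rho, emit a random spike so that the neuron
    # (and thus its learning) stays active.
    return u >= theta or rng() < rho

def learn_layer(weights, responses, fired, alpha=0.05, w_max=1.0):
    # Fig. 7: after a neuron has just fired, increase each weight toward
    # w_max proportionally to its incoming response (Eq. (7)), then
    # rescale uniformly so the average weight of the neuron stays fixed.
    if not fired:
        return weights
    avg_before = sum(weights) / len(weights)
    new_w = [w + alpha * (w_max - w) * r for w, r in zip(weights, responses)]
    avg_after = sum(new_w) / len(new_w)
    scale = avg_before / avg_after
    return [w * scale for w in new_w]
```

Under this sketch, random spikes guarantee a floor of activity even while u stays below ϑ, so the Eq. (7) updates keep being triggered throughout early learning.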

3.2 Using Concurrent Input Layers in the Network Structure

This section proposes an SNN architecture with concurrent input layers to enhance the ability of a neural network to detect regularities in an input. Commonly a neural network has a linear feed-forward structure. This also holds for SNNs (Fig. 8). For technical reasons, input layers are usually not true layers: the input values of the whole neural network are put directly to the outputs of the input layer, so it does not compute anything (see the algorithm in Fig. 9). With SNNs, the input layer can additionally take the role of delaying the input signals. As the axonal delays are predefined at random, the input layers of SNNs provide a kind of spatial representation of temporal input information.

Fig. 8. A simple example of a feed-forward linear architecture of a spiking neural network: input → input layer as a set of axonal delays → hidden layer → output layer with no axonal delays → output. Connections between layers are depicted as arrows, and they denote an all-to-all connection between the neurons of both layers.

Algorithm run_neural_network
Begin
    Do many times
        Fetch the next input into X
        Set X to the outputs of the input layer (in SNNs, directly generate spikes according to X)
        Process the rest of the layers in a row

Fig. 9. Running a neural network with a linear structure and a single input layer.

Instead of a single input layer, several concurrent input layers are proposed by the author. Since the axonal delays of the neurons are predefined random values, a bunch of input layers provides several kinds of spatial representations of the input signals (Fig. 10). The idea is very simple: at each iteration, the input is duplicated for all the input layers (see the algorithm in Fig. 11).
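The duplication idea can be sketched as follows (the class `InputLayer` and all names are hypothetical, standing in for the input layers of Fig. 10; each layer computes nothing and only delays its inputs by its own randomly initialized axonal delays):

```python
import random

class InputLayer:
    # An input layer computes nothing; it only delays each input
    # signal by its own randomly initialized axonal delay.
    def __init__(self, n_inputs, max_delay=4, seed=0):
        rng = random.Random(seed)
        self.delays = [rng.randint(0, max_delay) for _ in range(n_inputs)]

    def present(self, spikes, t):
        # Each incoming spike at time t reappears at t + delay.
        return [t + d for s, d in zip(spikes, self.delays) if s]

def run_concurrent_input_layers(layers, spikes, t):
    # Fig. 11: the same input X is duplicated to every input layer,
    # yielding several differently delayed spatial representations.
    return [layer.present(spikes, t) for layer in layers]
```

With, e.g., four layers, each carrying its own random delays, the hidden layer sees four differently time-shifted copies of the same input pattern.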

By this, a network receives several varieties of partly delayed input signals, so the concurrent input layers serve as a grid of first-level perception. As the experimental results show, this structural feature leads to faster and better learning in an SNN.

Fig. 10. A spiking neural network architecture with concurrent input layers: the input is duplicated across several concurrent input layers (each a set of axonal delays), followed by a hidden layer and an output layer with no axonal delays.

Algorithm run_neural_network_concurrent_input_layers
Begin
    Do many times
        Fetch the next input into X
        Forall L in input layers Do
            Set X to L
        Process the rest of the layers in a row

Fig. 11. Running a network with concurrent input layers.

4 Experimental Work

To show the introduced methods at work, extensive experimental work was accomplished using original software built for that purpose. Each experiment was run to solve the task of detecting one kind of temporal pattern in an input stream. The experiments were carried out using several configurations of SNNs in order to compare them and mark out the effect of the two ideas introduced in this paper.

4.1 Setup of the Experiments

The proposed learning method was tested on the detection of reiterative temporal patterns in an input stream. A trained neural network should be able to detect such patterns by giving a spike in the output after a pattern is encountered. The testing environment produces three binary signals at each discrete time moment and passes them to the input of the neural network. In general, the signals are generated at random, but sometimes patterns from a predefined store are interwoven into them. There are six possible temporal patterns in the store (Fig. 12). The proportion of 1s and 0s in the random part equals the one in the patterns (20%).

Fig. 12. The 6 temporal patterns used in the experiments: width = 3 signals, length = 10 time steps; black squares denote 1 (a spike), white squares 0 (no spike).

In each separate experiment, one of the six patterns was occasionally put into the input stream (7 patterns of size 10 within 200 time steps on average), so the proportion of patterns in the input stream was about 35%. The output spikes of the trained networks were analyzed as to how they match the patterns in the input stream. The pattern recognition rate was measured as the rate of responses of a network to input patterns at a fixed position with regard to the end of a pattern. The positions tried were of the interval [-5, +5]. For instance, a pattern recognition rate of 90% means that there is a position p for which it is true that, for 90% of the patterns of the input stream, a spike of the neural network is encountered at position p with regard to the pattern. In addition, the remaining spikes, the false positive ones, were also counted (Fig. 13). The false positive rate refers to the proportion between the false positive spikes and the actual patterns in the input stream.

Fig. 13. An excerpt of an input stream, and examples of the responses of the network to it (a spike counts as a correct hit if the fixed position is 2). Here the value of the position can be chosen as 2, so the pattern recognition rate is computed as 100% (as the only pattern is recognized), and the false positive rate is 2 (two false positive spikes against one pattern).
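The evaluation described above can be sketched as follows (a simplified reimplementation, not the paper's measurement software; `pattern_ends` and `output_spikes` are assumed to be lists of time steps):

```python
def evaluate(pattern_ends, output_spikes, window=range(-5, 6)):
    # Pick the position p (relative to the pattern ends) with the most
    # hits; the recognition rate is the hits at p per pattern, and the
    # false positive rate is all remaining spikes per pattern (Fig. 13).
    spikes = set(output_spikes)

    def hits(p):
        return sum(1 for end in pattern_ends if end + p in spikes)

    best_p = max(window, key=hits)
    n = len(pattern_ends)
    return hits(best_p) / n, (len(output_spikes) - hits(best_p)) / n
```

Applied to data shaped like the Fig. 13 excerpt (one pattern, one hit, two extra spikes), this yields a recognition rate of 100% and a false positive rate of 2.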

To show the introduced methods at work, a total of 15,000 experiments was conducted. Each experiment consisted of 20,000 iterations in the learning phase, and then 200 iterations in the recall phase, the results of which were recorded and further analyzed. Five different configurations of SNNs were used (Table 1), in accordance with the general structure shown in Fig. 10 and the algorithm in Fig. 7. Different configurations of SNNs were used in order to compare them and mark out the effect of the two ideas introduced in this paper: random spikes and concurrent input layers. The functionality of the SNNs was built as described in Sections 2.1 and 3, with a random spike probability of 4%, a threshold of 0.3, and axonal delays randomly set from the interval [0, 4]. Along with analyzing the direct results of the experiments, side effects were also determined: those of false positives and of the position of pattern detection.

Table 1. Network configurations used in the experimentation.

    Name of the configuration   Usage of random spikes   Network architecture description (for all types, there are 3 inputs and 1 output)
    A. Random=N                 No                       4 concurrent inputs, 12 hidden neurons
    B. Random=N                 No                       8 concurrent inputs, 24 hidden neurons
    C. Random=Y                 Yes                      4 concurrent inputs, 12 hidden neurons
    D. Random=Y                 Yes                      no concurrent inputs, 24 hidden neurons
    E. Random=Y                 Yes                      8 concurrent inputs, 24 hidden neurons

4.2 Analysis of the Results

Fig. 14 shows the general overview of the experimental results for the five configurations of SNNs. The configurations C, D, and E show the positive effect of using random spikes on recognition. The configuration E shows the impact of additional concurrent input layers, while D and E show the effect of both concurrent input layers and a sufficiently sized hidden layer. The configuration D (in contrast to C) might signal that a large hidden layer brings the same effect as concurrent input layers do; however, it suffers from instability of detection (see Fig. 16). The best results were obtained by obeying all of the following conditions:

1) Using random spikes;
2) Using more concurrent input layers;
3) Using a hidden layer of a sufficient size.

The main results demonstrate that SNNs can be successfully trained to detect temporal patterns.
During the experiments, there were also undesirable side effects, which should be reduced in further research.

Fig. 14. Pattern recognition rate obtained for the various configurations of SNNs (configuration A reaching only 22%). For the best results, random spikes and concurrent input layers should be used.

The most noticeable one is that of false positives (Fig. 15). The results show that, along with the successful detection of input patterns, a lot of redundant spikes are additionally generated by the neural network.

Fig. 15. 'False positive' rates for the configurations of SNNs. High false positive rates signal that the algorithms should be further improved and/or better configured.

Fig. 16 shows how the architecture of the network influences the length of the detected patterns. The networks of the configurations C and D detect most patterns at earlier positions (see also Fig. 13), while the stronger networks (E) typically detect patterns almost at the end position (-1); moreover, the position of detection was much more determined (50% for the positions of highest occurrence). The configuration D is especially unstable, so it has such a good recognition rate (Fig. 14) only because of the methodology used.

Fig. 16. Occurrence distribution of the positions (with regard to the ends of patterns) at which the best pattern recognition rate was obtained (only for the three configurations with random spikes). The configuration E shows the most stable results.

5 Conclusion

Spiking neural networks are believed to be very powerful; for all that, they are comparatively complicated, especially in terms of learning. In this paper, two ideas have been introduced to enhance the learning and operation of a variety of SNNs (characterized by the principle that learning within a neuron is performed only after the neuron has just fired):

1) Using random spikes to additionally stimulate the neuron;
2) Using several concurrent input layers to better preprocess the input.

The principle of random spikes ensures a minimum activity of a neuron and, by this, also enables the learning process to happen. The experimental results show it to be essential for a good recognition rate. Although the experiments confirm the significance of hidden layers of sufficient sizes, for the best results concurrent input layers are very important. The proposed approach furnishes an effect on the learning of SNNs similar to the one in the previous research; nevertheless, the new methods are less complicated and have demonstrated themselves to be more stable.

References

1. Burkitt, A.N.: A review of the integrate-and-fire neuron model: I. Homogeneous synaptic input. Biological Cybernetics, Volume 95, Number 1. Springer (2006)
2. Burkitt, A.N.: A review of the integrate-and-fire neuron model: II. Inhomogeneous synaptic input and network properties. Biological Cybernetics, Volume 95, Number 2. Springer (2006)
3. Fausett, L.: Fundamentals of Neural Networks. Prentice-Hall, Inc. (1993)

4. Gerstner, W.: Spiking Neurons. In: Maass, W., Bishop, C.M. (eds.) Pulsed Neural Networks. MIT Press (1999)
5. Hebb, D.: The Organization of Behavior. John Wiley and Sons, New York (1949)
6. Izhikevich, E.M.: Simple Model of Spiking Neurons. IEEE Transactions on Neural Networks, Volume 14, Number 6, November 2003. IEEE (2003)
7. Kempter, R., Gerstner, W., van Hemmen, L.: Hebbian learning and spiking neurons. Physical Review E, 59 (1999)
8. Maass, W.: Computing with Spiking Neurons. In: Maass, W., Bishop, C.M. (eds.) Pulsed Neural Networks. MIT Press (1999)
9. Maass, W.: Networks of spiking neurons: the third generation of neural network models. Transactions of the Society for Computer Simulation International, vol. 14, issue 4 (December 1997)
10. Salerno, M., Casali, D., Costantini, G., Carota, M.: Fully Asynchronous Neural Paradigm. In: Proceedings of the 15th IEEE Mediterranean Electrotechnical Conference (MELECON 2010). IEEE (2010)
11. Seung, H.S.: Learning in Spiking Neural Networks by Reinforcement of Stochastic Synaptic Transmission. Neuron, Volume 40, Number 6, 18 December 2003. Elsevier (2003)
12. Zuters, J.: Spiking Neural Networks to Detect Temporal Patterns. In: Haav, H.M., Kalja, A. (eds.) Frontiers in Artificial Intelligence and Applications, vol. 187, Databases and Information Systems V - Selected Papers from the Eighth International Baltic Conference, DB&IS 2008 (2009)


More information

CS434a/541a: Pattern Recognition Prof. Olga Veksler. Lecture 4

CS434a/541a: Pattern Recognition Prof. Olga Veksler. Lecture 4 CS434a/54a: Paern Recognon Prof. Olga Veksler Lecure 4 Oulne Normal Random Varable Properes Dscrmnan funcons Why Normal Random Varables? Analycally racable Works well when observaon comes form a corruped

More information

Polymerization Technology Laboratory Course

Polymerization Technology Laboratory Course Prakkum Polymer Scence/Polymersaonsechnk Versuch Resdence Tme Dsrbuon Polymerzaon Technology Laboraory Course Resdence Tme Dsrbuon of Chemcal Reacors If molecules or elemens of a flud are akng dfferen

More information

Time-interval analysis of β decay. V. Horvat and J. C. Hardy

Time-interval analysis of β decay. V. Horvat and J. C. Hardy Tme-nerval analyss of β decay V. Horva and J. C. Hardy Work on he even analyss of β decay [1] connued and resuled n he developmen of a novel mehod of bea-decay me-nerval analyss ha produces hghly accurae

More information

Comb Filters. Comb Filters

Comb Filters. Comb Filters The smple flers dscussed so far are characered eher by a sngle passband and/or a sngle sopband There are applcaons where flers wh mulple passbands and sopbands are requred Thecomb fler s an example of

More information

Implementation of Quantized State Systems in MATLAB/Simulink

Implementation of Quantized State Systems in MATLAB/Simulink SNE T ECHNICAL N OTE Implemenaon of Quanzed Sae Sysems n MATLAB/Smulnk Parck Grabher, Mahas Rößler 2*, Bernhard Henzl 3 Ins. of Analyss and Scenfc Compung, Venna Unversy of Technology, Wedner Haupsraße

More information

Robust and Accurate Cancer Classification with Gene Expression Profiling

Robust and Accurate Cancer Classification with Gene Expression Profiling Robus and Accurae Cancer Classfcaon wh Gene Expresson Proflng (Compuaonal ysems Bology, 2005) Auhor: Hafeng L, Keshu Zhang, ao Jang Oulne Background LDA (lnear dscrmnan analyss) and small sample sze problem

More information

Department of Economics University of Toronto

Department of Economics University of Toronto Deparmen of Economcs Unversy of Torono ECO408F M.A. Economercs Lecure Noes on Heeroskedascy Heeroskedascy o Ths lecure nvolves lookng a modfcaons we need o make o deal wh he regresson model when some of

More information

Boosted LMS-based Piecewise Linear Adaptive Filters

Boosted LMS-based Piecewise Linear Adaptive Filters 016 4h European Sgnal Processng Conference EUSIPCO) Boosed LMS-based Pecewse Lnear Adapve Flers Darush Kar and Iman Marvan Deparmen of Elecrcal and Elecroncs Engneerng Blken Unversy, Ankara, Turkey {kar,

More information

Efficient Asynchronous Channel Hopping Design for Cognitive Radio Networks

Efficient Asynchronous Channel Hopping Design for Cognitive Radio Networks Effcen Asynchronous Channel Hoppng Desgn for Cognve Rado Neworks Chh-Mn Chao, Chen-Yu Hsu, and Yun-ng Lng Absrac In a cognve rado nework (CRN), a necessary condon for nodes o communcae wh each oher s ha

More information

Neural Networks-Based Time Series Prediction Using Long and Short Term Dependence in the Learning Process

Neural Networks-Based Time Series Prediction Using Long and Short Term Dependence in the Learning Process Neural Neworks-Based Tme Seres Predcon Usng Long and Shor Term Dependence n he Learnng Process J. Puchea, D. Paño and B. Kuchen, Absrac In hs work a feedforward neural neworksbased nonlnear auoregresson

More information

On One Analytic Method of. Constructing Program Controls

On One Analytic Method of. Constructing Program Controls Appled Mahemacal Scences, Vol. 9, 05, no. 8, 409-407 HIKARI Ld, www.m-hkar.com hp://dx.do.org/0.988/ams.05.54349 On One Analyc Mehod of Consrucng Program Conrols A. N. Kvko, S. V. Chsyakov and Yu. E. Balyna

More information

Lecture 2 L n i e n a e r a M od o e d l e s

Lecture 2 L n i e n a e r a M od o e d l e s Lecure Lnear Models Las lecure You have learned abou ha s machne learnng Supervsed learnng Unsupervsed learnng Renforcemen learnng You have seen an eample learnng problem and he general process ha one

More information

Computing Relevance, Similarity: The Vector Space Model

Computing Relevance, Similarity: The Vector Space Model Compung Relevance, Smlary: The Vecor Space Model Based on Larson and Hears s sldes a UC-Bereley hp://.sms.bereley.edu/courses/s0/f00/ aabase Managemen Sysems, R. Ramarshnan ocumen Vecors v ocumens are

More information

Bandlimited channel. Intersymbol interference (ISI) This non-ideal communication channel is also called dispersive channel

Bandlimited channel. Intersymbol interference (ISI) This non-ideal communication channel is also called dispersive channel Inersymol nererence ISI ISI s a sgnal-dependen orm o nererence ha arses ecause o devaons n he requency response o a channel rom he deal channel. Example: Bandlmed channel Tme Doman Bandlmed channel Frequency

More information

Reactive Methods to Solve the Berth AllocationProblem with Stochastic Arrival and Handling Times

Reactive Methods to Solve the Berth AllocationProblem with Stochastic Arrival and Handling Times Reacve Mehods o Solve he Berh AllocaonProblem wh Sochasc Arrval and Handlng Tmes Nsh Umang* Mchel Berlare* * TRANSP-OR, Ecole Polyechnque Fédérale de Lausanne Frs Workshop on Large Scale Opmzaon November

More information

A Novel Efficient Stopping Criterion for BICM-ID System

A Novel Efficient Stopping Criterion for BICM-ID System A Novel Effcen Soppng Creron for BICM-ID Sysem Xao Yng, L Janpng Communcaon Unversy of Chna Absrac Ths paper devses a novel effcen soppng creron for b-nerleaved coded modulaon wh erave decodng (BICM-ID)

More information

Using Fuzzy Pattern Recognition to Detect Unknown Malicious Executables Code

Using Fuzzy Pattern Recognition to Detect Unknown Malicious Executables Code Usng Fuzzy Paern Recognon o Deec Unknown Malcous Execuables Code Boyun Zhang,, Janpng Yn, and Jngbo Hao School of Compuer Scence, Naonal Unversy of Defense Technology, Changsha 40073, Chna hnxzby@yahoo.com.cn

More information

12d Model. Civil and Surveying Software. Drainage Analysis Module Detention/Retention Basins. Owen Thornton BE (Mech), 12d Model Programmer

12d Model. Civil and Surveying Software. Drainage Analysis Module Detention/Retention Basins. Owen Thornton BE (Mech), 12d Model Programmer d Model Cvl and Surveyng Soware Dranage Analyss Module Deenon/Reenon Basns Owen Thornon BE (Mech), d Model Programmer owen.hornon@d.com 4 January 007 Revsed: 04 Aprl 007 9 February 008 (8Cp) Ths documen

More information

GENERATING CERTAIN QUINTIC IRREDUCIBLE POLYNOMIALS OVER FINITE FIELDS. Youngwoo Ahn and Kitae Kim

GENERATING CERTAIN QUINTIC IRREDUCIBLE POLYNOMIALS OVER FINITE FIELDS. Youngwoo Ahn and Kitae Kim Korean J. Mah. 19 (2011), No. 3, pp. 263 272 GENERATING CERTAIN QUINTIC IRREDUCIBLE POLYNOMIALS OVER FINITE FIELDS Youngwoo Ahn and Kae Km Absrac. In he paper [1], an explc correspondence beween ceran

More information

Chapter 6: AC Circuits

Chapter 6: AC Circuits Chaper 6: AC Crcus Chaper 6: Oulne Phasors and he AC Seady Sae AC Crcus A sable, lnear crcu operang n he seady sae wh snusodal excaon (.e., snusodal seady sae. Complee response forced response naural response.

More information

Example: MOSFET Amplifier Distortion

Example: MOSFET Amplifier Distortion 4/25/2011 Example MSFET Amplfer Dsoron 1/9 Example: MSFET Amplfer Dsoron Recall hs crcu from a prevous handou: ( ) = I ( ) D D d 15.0 V RD = 5K v ( ) = V v ( ) D o v( ) - K = 2 0.25 ma/v V = 2.0 V 40V.

More information

P R = P 0. The system is shown on the next figure:

P R = P 0. The system is shown on the next figure: TPG460 Reservor Smulaon 08 page of INTRODUCTION TO RESERVOIR SIMULATION Analycal and numercal soluons of smple one-dmensonal, one-phase flow equaons As an nroducon o reservor smulaon, we wll revew he smples

More information

Volatility Interpolation

Volatility Interpolation Volaly Inerpolaon Prelmnary Verson March 00 Jesper Andreasen and Bran Huge Danse Mares, Copenhagen wan.daddy@danseban.com brno@danseban.com Elecronc copy avalable a: hp://ssrn.com/absrac=69497 Inro Local

More information

CONSISTENT EARTHQUAKE ACCELERATION AND DISPLACEMENT RECORDS

CONSISTENT EARTHQUAKE ACCELERATION AND DISPLACEMENT RECORDS APPENDX J CONSSTENT EARTHQUAKE ACCEERATON AND DSPACEMENT RECORDS Earhqake Acceleraons can be Measred. However, Srcres are Sbjeced o Earhqake Dsplacemens J. NTRODUCTON { XE "Acceleraon Records" }A he presen

More information

Appendix to Online Clustering with Experts

Appendix to Online Clustering with Experts A Appendx o Onlne Cluserng wh Expers Furher dscusson of expermens. Here we furher dscuss expermenal resuls repored n he paper. Ineresngly, we observe ha OCE (and n parcular Learn- ) racks he bes exper

More information

Video-Based Face Recognition Using Adaptive Hidden Markov Models

Video-Based Face Recognition Using Adaptive Hidden Markov Models Vdeo-Based Face Recognon Usng Adapve Hdden Markov Models Xaomng Lu and suhan Chen Elecrcal and Compuer Engneerng, Carnege Mellon Unversy, Psburgh, PA, 523, U.S.A. xaomng@andrew.cmu.edu suhan@cmu.edu Absrac

More information

Let s treat the problem of the response of a system to an applied external force. Again,

Let s treat the problem of the response of a system to an applied external force. Again, Page 33 QUANTUM LNEAR RESPONSE FUNCTON Le s rea he problem of he response of a sysem o an appled exernal force. Agan, H() H f () A H + V () Exernal agen acng on nernal varable Hamlonan for equlbrum sysem

More information

On computing differential transform of nonlinear non-autonomous functions and its applications

On computing differential transform of nonlinear non-autonomous functions and its applications On compung dfferenal ransform of nonlnear non-auonomous funcons and s applcaons Essam. R. El-Zahar, and Abdelhalm Ebad Deparmen of Mahemacs, Faculy of Scences and Humanes, Prnce Saam Bn Abdulazz Unversy,

More information

Introduction ( Week 1-2) Course introduction A brief introduction to molecular biology A brief introduction to sequence comparison Part I: Algorithms

Introduction ( Week 1-2) Course introduction A brief introduction to molecular biology A brief introduction to sequence comparison Part I: Algorithms Course organzaon Inroducon Wee -2) Course nroducon A bref nroducon o molecular bology A bref nroducon o sequence comparson Par I: Algorhms for Sequence Analyss Wee 3-8) Chaper -3, Models and heores» Probably

More information

Approximate, Computationally Efficient Online Learning in Bayesian Spiking Neurons

Approximate, Computationally Efficient Online Learning in Bayesian Spiking Neurons Approxmae, Compuaonally Effcen Onlne Learnng n Bayesan Spkng Neurons Levn Kuhlmann levnk@unmelb.edu.au NeuroEngneerng Laboraory, Deparmen of Elecrcal & Elecronc Engneerng, The Unversy of Melbourne, and

More information

Graduate Macroeconomics 2 Problem set 5. - Solutions

Graduate Macroeconomics 2 Problem set 5. - Solutions Graduae Macroeconomcs 2 Problem se. - Soluons Queson 1 To answer hs queson we need he frms frs order condons and he equaon ha deermnes he number of frms n equlbrum. The frms frs order condons are: F K

More information

2.1 Constitutive Theory

2.1 Constitutive Theory Secon.. Consuve Theory.. Consuve Equaons Governng Equaons The equaons governng he behavour of maerals are (n he spaal form) dρ v & ρ + ρdv v = + ρ = Conservaon of Mass (..a) d x σ j dv dvσ + b = ρ v& +

More information

John Geweke a and Gianni Amisano b a Departments of Economics and Statistics, University of Iowa, USA b European Central Bank, Frankfurt, Germany

John Geweke a and Gianni Amisano b a Departments of Economics and Statistics, University of Iowa, USA b European Central Bank, Frankfurt, Germany Herarchcal Markov Normal Mxure models wh Applcaons o Fnancal Asse Reurns Appendx: Proofs of Theorems and Condonal Poseror Dsrbuons John Geweke a and Gann Amsano b a Deparmens of Economcs and Sascs, Unversy

More information

Algorithm Research on Moving Object Detection of Surveillance Video Sequence *

Algorithm Research on Moving Object Detection of Surveillance Video Sequence * Opcs and Phooncs Journal 03 3 308-3 do:0.436/opj.03.3b07 Publshed Onlne June 03 (hp://www.scrp.org/journal/opj) Algorhm Research on Movng Objec Deecon of Survellance Vdeo Sequence * Kuhe Yang Zhmng Ca

More information

Chapter Lagrangian Interpolation

Chapter Lagrangian Interpolation Chaper 5.4 agrangan Inerpolaon Afer readng hs chaper you should be able o:. dere agrangan mehod of nerpolaon. sole problems usng agrangan mehod of nerpolaon and. use agrangan nerpolans o fnd deraes and

More information

Approximate Analytic Solution of (2+1) - Dimensional Zakharov-Kuznetsov(Zk) Equations Using Homotopy

Approximate Analytic Solution of (2+1) - Dimensional Zakharov-Kuznetsov(Zk) Equations Using Homotopy Arcle Inernaonal Journal of Modern Mahemacal Scences, 4, (): - Inernaonal Journal of Modern Mahemacal Scences Journal homepage: www.modernscenfcpress.com/journals/jmms.aspx ISSN: 66-86X Florda, USA Approxmae

More information

J i-1 i. J i i+1. Numerical integration of the diffusion equation (I) Finite difference method. Spatial Discretization. Internal nodes.

J i-1 i. J i i+1. Numerical integration of the diffusion equation (I) Finite difference method. Spatial Discretization. Internal nodes. umercal negraon of he dffuson equaon (I) Fne dfference mehod. Spaal screaon. Inernal nodes. R L V For hermal conducon le s dscree he spaal doman no small fne spans, =,,: Balance of parcles for an nernal

More information

DEEP UNFOLDING FOR MULTICHANNEL SOURCE SEPARATION SUPPLEMENTARY MATERIAL

DEEP UNFOLDING FOR MULTICHANNEL SOURCE SEPARATION SUPPLEMENTARY MATERIAL DEEP UNFOLDING FOR MULTICHANNEL SOURCE SEPARATION SUPPLEMENTARY MATERIAL Sco Wsdom, John Hershey 2, Jonahan Le Roux 2, and Shnj Waanabe 2 Deparmen o Elecrcal Engneerng, Unversy o Washngon, Seale, WA, USA

More information

SOME NOISELESS CODING THEOREMS OF INACCURACY MEASURE OF ORDER α AND TYPE β

SOME NOISELESS CODING THEOREMS OF INACCURACY MEASURE OF ORDER α AND TYPE β SARAJEVO JOURNAL OF MATHEMATICS Vol.3 (15) (2007), 137 143 SOME NOISELESS CODING THEOREMS OF INACCURACY MEASURE OF ORDER α AND TYPE β M. A. K. BAIG AND RAYEES AHMAD DAR Absrac. In hs paper, we propose

More information

Political Economy of Institutions and Development: Problem Set 2 Due Date: Thursday, March 15, 2019.

Political Economy of Institutions and Development: Problem Set 2 Due Date: Thursday, March 15, 2019. Polcal Economy of Insuons and Developmen: 14.773 Problem Se 2 Due Dae: Thursday, March 15, 2019. Please answer Quesons 1, 2 and 3. Queson 1 Consder an nfne-horzon dynamc game beween wo groups, an ele and

More information

CS286.2 Lecture 14: Quantum de Finetti Theorems II

CS286.2 Lecture 14: Quantum de Finetti Theorems II CS286.2 Lecure 14: Quanum de Fne Theorems II Scrbe: Mara Okounkova 1 Saemen of he heorem Recall he las saemen of he quanum de Fne heorem from he prevous lecure. Theorem 1 Quanum de Fne). Le ρ Dens C 2

More information

Robustness Experiments with Two Variance Components

Robustness Experiments with Two Variance Components Naonal Insue of Sandards and Technology (NIST) Informaon Technology Laboraory (ITL) Sascal Engneerng Dvson (SED) Robusness Expermens wh Two Varance Componens by Ana Ivelsse Avlés avles@ns.gov Conference

More information

Introduction to Boosting

Introduction to Boosting Inroducon o Boosng Cynha Rudn PACM, Prnceon Unversy Advsors Ingrd Daubeches and Rober Schapre Say you have a daabase of news arcles, +, +, -, -, +, +, -, -, +, +, -, -, +, +, -, + where arcles are labeled

More information

Introduction to. Computer Animation

Introduction to. Computer Animation Inroducon o 1 Movaon Anmaon from anma (la.) = soul, spr, breah of lfe Brng mages o lfe! Examples Characer anmaon (humans, anmals) Secondary moon (har, cloh) Physcal world (rgd bodes, waer, fre) 2 2 Anmaon

More information

CHAPTER 10: LINEAR DISCRIMINATION

CHAPTER 10: LINEAR DISCRIMINATION CHAPER : LINEAR DISCRIMINAION Dscrmnan-based Classfcaon 3 In classfcaon h K classes (C,C,, C k ) We defned dscrmnan funcon g j (), j=,,,k hen gven an es eample, e chose (predced) s class label as C f g

More information

Bayes rule for a classification problem INF Discriminant functions for the normal density. Euclidean distance. Mahalanobis distance

Bayes rule for a classification problem INF Discriminant functions for the normal density. Euclidean distance. Mahalanobis distance INF 43 3.. Repeon Anne Solberg (anne@f.uo.no Bayes rule for a classfcaon problem Suppose we have J, =,...J classes. s he class label for a pxel, and x s he observed feaure vecor. We can use Bayes rule

More information

Mechanics Physics 151

Mechanics Physics 151 Mechancs Physcs 5 Lecure 9 Hamlonan Equaons of Moon (Chaper 8) Wha We Dd Las Tme Consruced Hamlonan formalsm H ( q, p, ) = q p L( q, q, ) H p = q H q = p H = L Equvalen o Lagrangan formalsm Smpler, bu

More information

New conditioning model for robots

New conditioning model for robots ESANN 011 proceedngs, European Symposum on Arfcal Neural Newors, Compuaonal Inellgence and Machne Learnng. Bruges (Belgum), 7-9 Aprl 011, 6doc.com publ., ISBN 978--87419-044-5. Avalable from hp://www.6doc.com/en/lvre/?gcoi=8001100817300.

More information

Sampling Procedure of the Sum of two Binary Markov Process Realizations

Sampling Procedure of the Sum of two Binary Markov Process Realizations Samplng Procedure of he Sum of wo Bnary Markov Process Realzaons YURY GORITSKIY Dep. of Mahemacal Modelng of Moscow Power Insue (Techncal Unversy), Moscow, RUSSIA, E-mal: gorsky@yandex.ru VLADIMIR KAZAKOV

More information

Lecture 11 SVM cont

Lecture 11 SVM cont Lecure SVM con. 0 008 Wha we have done so far We have esalshed ha we wan o fnd a lnear decson oundary whose margn s he larges We know how o measure he margn of a lnear decson oundary Tha s: he mnmum geomerc

More information

Mechanics Physics 151

Mechanics Physics 151 Mechancs Physcs 5 Lecure 9 Hamlonan Equaons of Moon (Chaper 8) Wha We Dd Las Tme Consruced Hamlonan formalsm Hqp (,,) = qp Lqq (,,) H p = q H q = p H L = Equvalen o Lagrangan formalsm Smpler, bu wce as

More information

ISSN MIT Publications

ISSN MIT Publications MIT Inernaonal Journal of Elecrcal and Insrumenaon Engneerng Vol. 1, No. 2, Aug 2011, pp 93-98 93 ISSN 2230-7656 MIT Publcaons A New Approach for Solvng Economc Load Dspach Problem Ansh Ahmad Dep. of Elecrcal

More information

Li An-Ping. Beijing , P.R.China

Li An-Ping. Beijing , P.R.China A New Type of Cpher: DICING_csb L An-Png Bejng 100085, P.R.Chna apl0001@sna.com Absrac: In hs paper, we wll propose a new ype of cpher named DICING_csb, whch s derved from our prevous sream cpher DICING.

More information

Fall 2010 Graduate Course on Dynamic Learning

Fall 2010 Graduate Course on Dynamic Learning Fall 200 Graduae Course on Dynamc Learnng Chaper 4: Parcle Flers Sepember 27, 200 Byoung-Tak Zhang School of Compuer Scence and Engneerng & Cognve Scence and Bran Scence Programs Seoul aonal Unversy hp://b.snu.ac.kr/~bzhang/

More information

An Effective TCM-KNN Scheme for High-Speed Network Anomaly Detection

An Effective TCM-KNN Scheme for High-Speed Network Anomaly Detection Vol. 24, November,, 200 An Effecve TCM-KNN Scheme for Hgh-Speed Nework Anomaly eecon Yang L Chnese Academy of Scences, Bejng Chna, 00080 lyang@sofware.c.ac.cn Absrac. Nework anomaly deecon has been a ho

More information

e-journal Reliability: Theory& Applications No 2 (Vol.2) Vyacheslav Abramov

e-journal Reliability: Theory& Applications No 2 (Vol.2) Vyacheslav Abramov June 7 e-ournal Relably: Theory& Applcaons No (Vol. CONFIDENCE INTERVALS ASSOCIATED WITH PERFORMANCE ANALYSIS OF SYMMETRIC LARGE CLOSED CLIENT/SERVER COMPUTER NETWORKS Absrac Vyacheslav Abramov School

More information

CH.3. COMPATIBILITY EQUATIONS. Continuum Mechanics Course (MMC) - ETSECCPB - UPC

CH.3. COMPATIBILITY EQUATIONS. Continuum Mechanics Course (MMC) - ETSECCPB - UPC CH.3. COMPATIBILITY EQUATIONS Connuum Mechancs Course (MMC) - ETSECCPB - UPC Overvew Compably Condons Compably Equaons of a Poenal Vecor Feld Compably Condons for Infnesmal Srans Inegraon of he Infnesmal

More information

Detection of Waving Hands from Images Using Time Series of Intensity Values

Detection of Waving Hands from Images Using Time Series of Intensity Values Deecon of Wavng Hands from Images Usng Tme eres of Inensy Values Koa IRIE, Kazunor UMEDA Chuo Unversy, Tokyo, Japan re@sensor.mech.chuo-u.ac.jp, umeda@mech.chuo-u.ac.jp Absrac Ths paper proposes a mehod

More information

TSS = SST + SSE An orthogonal partition of the total SS

TSS = SST + SSE An orthogonal partition of the total SS ANOVA: Topc 4. Orhogonal conrass [ST&D p. 183] H 0 : µ 1 = µ =... = µ H 1 : The mean of a leas one reamen group s dfferen To es hs hypohess, a basc ANOVA allocaes he varaon among reamen means (SST) equally

More information

National Exams December 2015 NOTES: 04-BS-13, Biology. 3 hours duration

National Exams December 2015 NOTES: 04-BS-13, Biology. 3 hours duration Naonal Exams December 205 04-BS-3 Bology 3 hours duraon NOTES: f doub exss as o he nerpreaon of any queson he canddae s urged o subm wh he answer paper a clear saemen of any assumpons made 2 Ths s a CLOSED

More information

Mechanics Physics 151

Mechanics Physics 151 Mechancs Physcs 5 Lecure 0 Canoncal Transformaons (Chaper 9) Wha We Dd Las Tme Hamlon s Prncple n he Hamlonan formalsm Dervaon was smple δi δ Addonal end-pon consrans pq H( q, p, ) d 0 δ q ( ) δq ( ) δ

More information

Lecture 18: The Laplace Transform (See Sections and 14.7 in Boas)

Lecture 18: The Laplace Transform (See Sections and 14.7 in Boas) Lecure 8: The Lalace Transform (See Secons 88- and 47 n Boas) Recall ha our bg-cure goal s he analyss of he dfferenal equaon, ax bx cx F, where we emloy varous exansons for he drvng funcon F deendng on

More information

Appendix H: Rarefaction and extrapolation of Hill numbers for incidence data

Appendix H: Rarefaction and extrapolation of Hill numbers for incidence data Anne Chao Ncholas J Goell C seh lzabeh L ander K Ma Rober K Colwell and Aaron M llson 03 Rarefacon and erapolaon wh ll numbers: a framewor for samplng and esmaon n speces dversy sudes cology Monographs

More information

MANY real-world applications (e.g. production

MANY real-world applications (e.g. production Barebones Parcle Swarm for Ineger Programmng Problems Mahamed G. H. Omran, Andres Engelbrech and Ayed Salman Absrac The performance of wo recen varans of Parcle Swarm Opmzaon (PSO) when appled o Ineger

More information

. The geometric multiplicity is dim[ker( λi. number of linearly independent eigenvectors associated with this eigenvalue.

. The geometric multiplicity is dim[ker( λi. number of linearly independent eigenvectors associated with this eigenvalue. Lnear Algebra Lecure # Noes We connue wh he dscusson of egenvalues, egenvecors, and dagonalzably of marces We wan o know, n parcular wha condons wll assure ha a marx can be dagonalzed and wha he obsrucons

More information

10. A.C CIRCUITS. Theoretically current grows to maximum value after infinite time. But practically it grows to maximum after 5τ. Decay of current :

10. A.C CIRCUITS. Theoretically current grows to maximum value after infinite time. But practically it grows to maximum after 5τ. Decay of current : . A. IUITS Synopss : GOWTH OF UNT IN IUIT : d. When swch S s closed a =; = d. A me, curren = e 3. The consan / has dmensons of me and s called he nducve me consan ( τ ) of he crcu. 4. = τ; =.63, n one

More information

Bernoulli process with 282 ky periodicity is detected in the R-N reversals of the earth s magnetic field

Bernoulli process with 282 ky periodicity is detected in the R-N reversals of the earth s magnetic field Submed o: Suden Essay Awards n Magnecs Bernoull process wh 8 ky perodcy s deeced n he R-N reversals of he earh s magnec feld Jozsef Gara Deparmen of Earh Scences Florda Inernaonal Unversy Unversy Park,

More information

Panel Data Regression Models

Panel Data Regression Models Panel Daa Regresson Models Wha s Panel Daa? () Mulple dmensoned Dmensons, e.g., cross-secon and me node-o-node (c) Pongsa Pornchawseskul, Faculy of Economcs, Chulalongkorn Unversy (c) Pongsa Pornchawseskul,

More information

Relative controllability of nonlinear systems with delays in control

Relative controllability of nonlinear systems with delays in control Relave conrollably o nonlnear sysems wh delays n conrol Jerzy Klamka Insue o Conrol Engneerng, Slesan Techncal Unversy, 44- Glwce, Poland. phone/ax : 48 32 37227, {jklamka}@a.polsl.glwce.pl Keywor: Conrollably.

More information

Hidden Markov Models Following a lecture by Andrew W. Moore Carnegie Mellon University

Hidden Markov Models Following a lecture by Andrew W. Moore Carnegie Mellon University Hdden Markov Models Followng a lecure by Andrew W. Moore Carnege Mellon Unversy www.cs.cmu.edu/~awm/uorals A Markov Sysem Has N saes, called s, s 2.. s N s 2 There are dscree meseps, 0,, s s 3 N 3 0 Hdden

More information

Advanced time-series analysis (University of Lund, Economic History Department)

Advanced time-series analysis (University of Lund, Economic History Department) Advanced me-seres analss (Unvers of Lund, Economc Hsor Dearmen) 3 Jan-3 Februar and 6-3 March Lecure 4 Economerc echnues for saonar seres : Unvarae sochasc models wh Box- Jenns mehodolog, smle forecasng

More information