Boosted LMS-based Piecewise Linear Adaptive Filters


2016 24th European Signal Processing Conference (EUSIPCO)

Boosted LMS-based Piecewise Linear Adaptive Filters

Dariush Kari and Iman Marivani, Department of Electrical and Electronics Engineering, Bilkent University, Ankara, Turkey
Ibrahim Delibalta, Turk Telekom Communications Services Inc., Istanbul, Turkey
Suleyman Serdar Kozat, Department of Electrical and Electronics Engineering, Bilkent University, Ankara, Turkey

Abstract—We introduce the boosting notion, extensively used in different machine learning applications, to the adaptive signal processing literature and implement several different adaptive filtering algorithms. In this framework, several adaptive constituent filters run in parallel. For each newly received input vector and observation pair, each filter adapts itself based on the performance of the other adaptive filters in the mixture on this current data pair. These relative updates provide the boosting effect, such that the filters in the mixture learn different attributes of the data, providing diversity. The outputs of these constituent filters are then combined using adaptive mixture approaches. We provide computational complexity bounds for the boosted adaptive filters. The introduced methods demonstrate improvements over conventional adaptive filtering algorithms due to the boosting effect.

I. INTRODUCTION

Boosting is considered one of the most important ensemble learning methods in the machine learning literature [1]-[3]. As an ensemble learning method [4], boosting combines several parallel running, weakly performing algorithms to build a final, strongly performing algorithm by finding a linear combination of the weak learners. However, significantly less attention has been given to the idea of boosting in the adaptive signal processing literature. To this end, our goals are (a) to use the boosting notion in adaptive filtering, (b) to derive several different adaptive filtering algorithms based on the boosting approach, and (c) to demonstrate the intrinsic connections of boosting with the adaptive mixture methods [5] and data reuse algorithms [6] widely studied in the adaptive signal processing literature. Although boosting was initially introduced in the batch setting [2], i.e., where algorithms boost themselves over a fixed set of training data, it was later extended to the online setting [7].
In the online setting, we neither need nor have a fixed set of training data; instead, the data arrives one by one as a stream. Each newly arriving sample is processed and then discarded without any storing. The online setting is naturally motivated by many real-life applications, especially those involving big data, where there is not enough storage space available or the constraints of the problem require instant processing [8]. For our purposes, however, the online setting is especially important since it is directly akin to the adaptive filtering framework, where the streaming or sequentially arriving data is used to adapt the internal parameters of the filter, either to adaptively learn the underlying model or to track the nonstationary data statistics [9]. Specifically, we have m parallel running adaptive filters that receive the input vectors sequentially, one by one. Each adaptive algorithm can use a different update, such as the recursive least squares (RLS) update or the least mean squares (LMS) update. After receiving the input vector, each algorithm produces its output and then calculates its instantaneous error once the observation is revealed. These updates are performed for all m constituent filters in the mixture. However, in the online boosting approaches, these adaptations at each time proceed in rounds from top to bottom, starting from the first adaptive filter down to the last one, to achieve the boosting effect [10]. Furthermore, unlike the usual mixture approaches [5], the update of each adaptive filter depends on the previous adaptive filters in the mixture. Based on the performance of the filters 1 to k on the current (x_t, d_t) pair, the (k+1)th filter may give more or less emphasis to the (x_t, d_t) pair in its adaptation, in order to rectify the mistakes of the previous adaptive filters. This idea is clearly related to the adaptive mixture algorithms widely used in the signal processing literature; however, unlike the mixture methods, the updates of the constituent filters are not independent in boosting methods. We implement our boosting algorithms on piecewise linear filters, since such filters deliver significantly superior performance compared to linear filters, with comparable complexity [11].
To this end, we apply the boosting notion to several parallel running piecewise linear LMS-based filters and introduce three different approaches to using the importance weights [10]. In the first approach, weighted updates, we use the importance weights directly to produce certain weighted LMS algorithms. In the second approach, data reuse, we use the importance weights to construct data reuse adaptive algorithms. The third approach, random updates, uses the importance weights to decide whether to update each constituent filter, based on a random number generated from a Bernoulli distribution with parameter equal to the weight. The random updates method can be effectively used for big data processing [12], due to its reduced complexity. The outputs of the constituent filters are combined using a linear filter to construct the final output of the algorithm. The final combination filter is also updated using the LMS algorithm [5].

II. PROBLEM DESCRIPTION AND BACKGROUND

All vectors are column vectors and are represented with bold lowercase letters. Matrices are represented by bold uppercase letters. For a vector a (or a matrix A), a^T (or A^T) is its transpose, and Tr(A) is the trace of the matrix A. The time index is given in the subscript, i.e., x_t is the sample at time t. We work with real data for notational simplicity. We denote the mean of a random variable x as E[x]. We sequentially receive r-dimensional input (regressor) vectors {x_t}_{t≥1}, x_t ∈ R^r, and desired data {d_t}_{t≥1}, and estimate d_t by

d̂_t = f_t(x_t),  (1)

in which f_t(·) is an adaptive filter. At each time t the estimation error is given by e_t = d_t − d̂_t, and it is used to update the parameters of the adaptive filter. For presentation purposes we assume that d_t ∈ [−1, 1]; however, our derivations hold for any bounded but arbitrary desired data sequence. For example, in the prediction problem d_t = x_{t+1}, and in the channel equalization application the {d_t} are the transmitted bits, where x_t is the data received from the channel. In our framework, we do not use any statistical assumptions on the input vectors or on the desired data, so that our results are guaranteed to hold in an individual sequence manner [13]. Note that although nonlinear filters can outperform linear filters, they usually suffer from overfitting, stability, and convergence issues [11], [14]. Furthermore, nonlinear filters generally have higher computational complexities, which limits their use in most real-life applications [11], [14]. To overcome these problems, piecewise linear filters have been proposed; they mitigate the overfitting and stability issues while offering modeling performance comparable to that of nonlinear filters [11], [14]. Therefore, in this paper we are particularly interested in piecewise linear filters, which serve as an elegant alternative to linear filters. We use a piecewise linear adaptive filtering method in which the desired signal is predicted as d̂_t = Σ_{i=1}^{N} s_{i,t} w_{i,t}^T x_t, where s_{i,t} is the indicator function of the ith region, i.e., s_{i,t} = 1 if x_t ∈ R_i, and s_{i,t} = 0 otherwise. Note that at each time t only one of the s_{i,t}'s is nonzero, which indicates the region in which x_t lies.
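As a minimal illustration of this region-indicator predictor (a sketch, not the paper's implementation), the following code evaluates d̂_t = Σ_i s_{i,t} w_{i,t}^T x_t for a hypothetical 2-region partition defined by a separating hyperplane; all function and variable names here are our own:

```python
import numpy as np

def piecewise_predict(x, regions, weights):
    """Piecewise linear prediction: only the region containing x fires,
    i.e., s_{i,t} = 1 for exactly one region and 0 for the others."""
    for contains, w in zip(regions, weights):
        if contains(x):              # indicator s_{i,t}
            return float(w @ x)      # that region's linear filter w_i^T x
    raise ValueError("regions must partition the input space")

# Hypothetical 2-region partition split by the hyperplane theta^T x = 0
theta = np.array([1.0, -1.0])                    # direction vector
regions = [lambda x: theta @ x >= 0,             # Region 1
           lambda x: theta @ x < 0]              # Region 2
weights = [np.array([0.5, 0.2]), np.array([-0.3, 0.1])]

x = np.array([1.0, 0.0])                         # theta^T x = 1 -> Region 1
print(piecewise_predict(x, regions, weights))    # w_1^T x = 0.5
```

Only one branch contributes per sample, which is why each update later touches a single region's coefficient vector.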
Thus, if x_t ∈ R_i, we update only the ith linear filter. As an example, consider 2-dimensional input vectors x_t, as depicted in Fig. 1. Here, we construct the piecewise linear filter f_t such that

d̂_t = f_t(x_t) = s_{1,t} w_{1,t}^T x_t + s_{2,t} w_{2,t}^T x_t = s_t w_{1,t}^T x_t + (1 − s_t) w_{2,t}^T x_t,  (2)

Then, if s_t = 1 we update w_{1,t}; otherwise we update w_{2,t}, based on the amount of the error e_t.

Fig. 1: A sample 2-region partition of the input vector (i.e., x_t) space, which is 2-dimensional in this example. A separating hyperplane with a given direction vector splits the space into Region 1 (where f_t(x_t) = x_t^T w_{1,t}, s_t = 1) and Region 2 (where f_t(x_t) = x_t^T w_{2,t}, s_t = 0); s_t determines whether x_t is in Region 1 or not.

III. BOOSTED LMS ALGORITHMS

As shown in Fig. 2, at each iteration t we have m parallel running adaptive filters with estimating functions f_t^(k), producing estimates d̂_t^(k) = f_t^(k)(x_t) of d_t, k = 1,...,m. As an example, if we use m linear filters, d̂_t^(k) = x_t^T w_t^(k) is the estimate generated by the kth constituent filter, and if we use piecewise linear filters (each with N different regions), d̂_t^(k) = Σ_{i=1}^{N} s_{i,t} x_t^T w_{i,t}^(k). The outputs of these m filters are then combined using the linear weights z_t to produce the final estimate as d̂_t = z_t^T y_t [5], where y_t ≜ [d̂_t^(1),..., d̂_t^(m)]^T is the vector of outputs. After the desired signal d_t is revealed, the m parallel running filters are updated for the next iteration. Moreover, the linear combination coefficients z_t are also updated using the ordinary LMS method, as detailed later in Section III-D. After d_t is revealed, the constituent filters f_t^(k), k = 1,...,m, are consecutively updated, as shown in Fig. 2, from top to bottom, i.e., first k = 1 is updated, then k = 2, and finally k = m. However, to enhance the performance, we use a boosted updating approach [2], such that the (k+1)th filter receives a total loss parameter l_t^(k+1) from the filter f_t^(k), which it uses to compute a weight λ_t^(k+1):

l_t^(k+1) = l_t^(k) + [σ² − (d_t − f_t^(k)(x_t))²].  (3)

The total loss parameter l_t^(k) indicates the sum of the differences between the desired mean squared error (MSE), σ², and the squared errors of the first k − 1 filters at time t. Then, the difference σ² − (e_t^(k))² is added to l_t^(k) to generate l_t^(k+1), and l_t^(k+1) is passed to the next constituent filter, as shown in Fig. 2.
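The recursion in (3) can be sketched in a few lines; this is an illustrative toy (our own names and numbers), chaining the per-filter surpluses σ² − (e_t^(k))² through the cascade:

```python
def propagate_loss(l_k, sigma2, err):
    """Eq. (3): l^{(k+1)} = l^{(k)} + (sigma^2 - e_k^2), the running
    surplus/deficit relative to the desired MSE, passed down the cascade."""
    return l_k + (sigma2 - err ** 2)

l = 0.0                        # l^{(1)} = 0
for e in [0.5, 0.1]:           # toy errors of filters k = 1, 2 at one time t
    l = propagate_loss(l, 0.04, e)
print(l)   # (0.04 - 0.25) + (0.04 - 0.01) = -0.18
```

A negative running value means the chain so far is missing the σ² target on this sample, which is exactly what pushes the next filter's weight λ toward 1.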
Here, [σ² − (d_t − f_t^(k)(x_t))²] measures how much the kth constituent filter is off with respect to the final MSE performance goal. For example, if d_t = f(x_t) + ν_t for some deterministic nonlinear function f(·), where ν_t is the observation noise, then σ² can be selected as an upper bound on the variance of the noise process ν_t. In this sense, l_t^(k) measures how the constituent filters j = 1,...,k−1 are cumulatively performing on the (d_t, x_t) pair with respect to the final performance goal. We then use the weight λ_t^(k) to update the kth constituent filter with one of the methods weighted updates, data reuse, or random updates, which are explained in the subsections of this section. Our aim is to make λ_t^(k) large if the first k − 1 constituent filters made large errors on d_t, so that the kth filter gives more importance to the (d_t, x_t) pair

in order to rectify the performance of the overall system. We now explain how to construct these weights, such that 0 < λ_t^(k) ≤ 1. To this end, we set λ_t^(1) = 1 for all t, and introduce a weighting similar to [10], [15]. We define the weights as

λ_t^(k) ≜ min{1, (δ_{t−1}^(k))^{c l_t^(k)}},

where δ_{t−1}^(k) indicates an estimate of the kth filter's MSE, and c ≥ 0 is a design parameter that determines the dependence of each filter update on the performance of the previous filters, i.e., c = 0 corresponds to independent updates, as in the ordinary combination of filters [5], while a greater c indicates a greater effect of the previous filters' performance on the weight λ_t^(k) of the current filter. Here, δ_{t−1}^(k) is an estimate of the weighted mean squared error (WMSE) of the kth constituent filter over {x_τ}_{τ≥1} and {d_τ}_{τ≥1}. In the basic implementation of online boosting [10], [15], (1 − δ_{t−1}^(k)) is set to the classification advantage of the weak learners [15], where this advantage is assumed to be the same for all weak learners k = 1,...,m.

Fig. 2: The block diagram of a boosted adaptive filtering system that uses the input vector x_t to produce the final estimate d̂_t. There are m constituent filters f^(1),...,f^(m), each of which is an adaptive piecewise linear filter that generates its own estimate d̂^(k). The final estimate d̂_t is a linear combination of the estimates generated by all these constituent filters, with the combination weight z^(k) corresponding to d̂^(k). The combination weights are stored in a vector that is updated after each iteration. At time t the kth filter is updated based on the values of λ_t^(k) and e_t^(k), and provides the (k+1)th filter with l_t^(k+1), which is used to compute λ_t^(k+1). The parameter δ_t^(k) indicates the average mean squared error (MSE) of the kth filter over the first t estimations, and is used in computing λ_t^(k).
In this paper, to avoid using any a priori knowledge and to be completely adaptive, we choose δ_{t−1}^(k) as the weighted and thresholded MSE of the kth filter up to time t − 1, i.e.,

δ_t^(k) = ( Σ_{τ=1}^{t} λ_τ^(k) (1/4) (d_τ − [f_τ^(k)(x_τ)])² ) / ( Σ_{τ=1}^{t} λ_τ^(k) ),  (4)

where [f_τ^(k)(x_τ)] thresholds f_τ^(k)(x_τ) into the range [−1, 1]. This thresholding is necessary to assure that 0 < δ_t^(k) ≤ 1, which guarantees 0 < λ_t^(k) ≤ 1 for all k = 1,...,m and all t. We point out that δ_t^(k) can be calculated recursively. Regarding the definitions of δ_t^(k) and λ_t^(k), if the kth filter is good, i.e., if δ_t^(k) is small enough, we pass less weight to the next filters, so that those filters can concentrate more on the other samples. Hence, the filters can increase the diversity by concentrating on different parts of the data [5]. Furthermore, the weights λ_t^(k) are larger, i.e., close to 1, if most of the constituent filters j = 1,...,k−1 have errors larger than σ² on (d_t, x_t), and smaller, i.e., close to 0, if the pair (d_t, x_t) is easily modeled by the previous constituent filters, so that the filters k,...,m do not need to concentrate more on this pair. Based on these weights, we next introduce three approaches to update the constituent filters, which are the piecewise linear filters explained in Section II, updated using the LMS algorithm.

A. Directly Using the λ's to Scale the Learning Rates

Since 0 < λ_t^(k) ≤ 1, these weights can be directly used to scale the learning rates of the LMS updates. When the kth filter receives the weight λ_t^(k), it updates its filter coefficients w_{i,t}^(k), i = 1,...,N, as

w_{i,t+1}^(k) = (I − μ^(k) λ_t^(k) x_t x_t^T) w_{i,t}^(k) + μ^(k) λ_t^(k) x_t d_t,  (5)

where 0 < μ^(k) is the learning rate. Note that we can choose μ^(k) = μ for all k, since the adaptive algorithms work consecutively from top to bottom, and the ith linear filter of each different constituent filter will still have a different effective learning rate μ λ_t^(k).

B. A Data Reuse Approach Based on the Weights

In this scenario, for updating w_{i,t}^(k), we use the LMS update n_t^(k) = ⌈K λ_t^(k)⌉ times, where K is a fixed integer, to obtain w_{i,t+1}^(k) as

q^(0) = w_{i,t}^(k),
q^(a) = (I − μ^(k) x_t x_t^T) q^(a−1) + μ^(k) x_t d_t,  a = 1,...,n_t^(k),
w_{i,t+1}^(k) = q^(n_t^(k)).  (6)
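The two update rules (5) and (6) can be sketched side by side; this is an illustrative sketch with made-up numbers, using the identity (I − μλ x xᵀ)w + μλ x d = w + μλ(d − wᵀx)x:

```python
import numpy as np

def weighted_lms_step(w, x, d, mu, lam):
    """Eq. (5): one LMS step with the learning rate scaled by lambda."""
    return w + mu * lam * (d - w @ x) * x   # = (I - mu*lam*x x^T) w + mu*lam*x d

def data_reuse_update(w, x, d, mu, lam, K):
    """Eq. (6): repeat the ordinary LMS step ceil(K * lambda) times
    on the same (x, d) pair."""
    for _ in range(int(np.ceil(K * lam))):
        w = w + mu * (d - w @ x) * x
    return w

w0 = np.zeros(2)
x, d = np.array([1.0, 2.0]), 1.0
print(weighted_lms_step(w0, x, d, mu=0.1, lam=0.5))      # one half-rate step
print(data_reuse_update(w0, x, d, mu=0.1, lam=0.5, K=4)) # ceil(2) = 2 full steps
```

Both schemes spend more adaptation effort on a sample when λ is large; weighted updates do so by scaling a single step, data reuse by repeating full steps.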
C. Random Updates Based on the Weights

In this scenario, we use the weight λ_t^(k) to generate a random number from a Bernoulli distribution, which equals 1 with probability λ_t^(k) and 0 with probability 1 − λ_t^(k). Then, if this number is 1, we perform the ordinary LMS update on w_{i,t}^(k); otherwise, we do not.
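A sketch of this Bernoulli-gated update (names are our own; the gate is the only difference from an ordinary LMS step):

```python
import numpy as np

def random_update(w, x, d, mu, lam, rng):
    """Perform the ordinary LMS update with probability lam (a Bernoulli
    draw); otherwise leave the filter untouched. Returns the (possibly
    unchanged) coefficients and whether an update occurred."""
    if rng.random() < lam:                   # Bernoulli(lam) outcome
        return w + mu * (d - w @ x) * x, True
    return w, False

rng = np.random.default_rng(0)
w = np.zeros(2)
x, d = np.array([1.0, -1.0]), 0.5
w, updated = random_update(w, x, d, mu=0.1, lam=0.7, rng=rng)
```

On average only a λ fraction of the steps is executed, which is the source of the reduced complexity analyzed in Section IV.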

Algorithm 1 Boosted LMS with the proposed methods
1: Input: (x_t, d_t) (data stream), m (number of LMS piecewise linear constituent filters running in parallel), and σ² (the desired MSE, an upper bound on the error variance).
2: Initialize the regression coefficients w_{i,1}^(k) for each LMS filter and the combination coefficients as z_1 = (1/m)[1, 1,...,1]^T, and for all k set δ_0^(k) = 0.
3: for t = 1 to T do
4:   Receive the regressor data instance x_t
5:   Compute the indicator functions s_{i,t}^(k) for all k
6:   Compute the constituent filter outputs d̂_t^(k) = Σ_{i=1}^{N} s_{i,t}^(k) x_t^T w_{i,t}^(k)
7:   Produce the final estimate d̂_t = z_t^T [d̂_t^(1),..., d̂_t^(m)]^T
8:   Receive the true output d_t (desired data)
9:   λ_t^(1) = 1, l_t^(1) = 0
10:  for k = 1 to m do
11:    Update the regression coefficients w_{i,t}^(k) by using LMS and the weight λ_t^(k), based on one of the introduced algorithms in Section III
12:    e_t^(k) = d_t − d̂_t^(k)
13:    λ_t^(k) = min{1, (δ_{t−1}^(k))^{c l_t^(k)}}
14:    δ_t^(k) = (Λ_{t−1}^(k) δ_{t−1}^(k) + λ_t^(k) (1/4)(d_t − [f_t^(k)(x_t)])²) / (Λ_{t−1}^(k) + λ_t^(k))
15:    Λ_t^(k) = Λ_{t−1}^(k) + λ_t^(k)
16:    l_t^(k+1) = l_t^(k) + (σ² − (e_t^(k))²)
17:  end for
18:  z_{t+1} = (I − μ y_t y_t^T) z_t + μ y_t d_t
19: end for

D. The Final Algorithm

After the desired data d_t is revealed, we update the constituent filters as well as the combination weights z_t. To update the combination weights, we again employ an LMS update, yielding

z_{t+1} = (I − μ y_t y_t^T) z_t + μ y_t d_t,  (7)

where μ > 0 and y_t = [d̂_t^(1),..., d̂_t^(m)]^T. The complete final algorithm is given in Algorithm 1.

IV. COMPLEXITY ANALYSIS

In this section we compare the complexities of the proposed algorithms and find an upper bound for the weights λ_t^(k). Suppose that the input vector has length r, i.e., x_t ∈ R^r. Each constituent filter performs O(r) computations to generate its estimate, and requires O(r) computations for updating the linear filters using the LMS method (in their most basic implementations). We derive the computational complexity of using the LMS updates in the different boosting scenarios. Since there are a total of m constituent filters, all of which are updated in the weighted updates method, this method has a computational cost of order O(mr) per iteration. However, in random updates, at iteration t the kth filter will or will not be updated, with probabilities λ_t^(k) and 1 − λ_t^(k), respectively.
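Algorithm 1 above can be sketched compactly; this is a hedged illustration rather than the authors' code: it uses plain linear constituent filters (the region indicators are omitted for brevity), the weighted updates scheme of Section III-A at the LMS step, and δ initialized to a small positive value instead of 0 so that the exponentiation in step 13 stays well defined:

```python
import numpy as np

def boosted_lms(stream, m=3, mu=0.05, sigma2=0.05, c=1.0):
    """Sketch of Algorithm 1: m linear constituent filters with weighted
    updates, combined by an LMS-adapted weight vector (eq. (7))."""
    r = len(stream[0][0])
    W = np.zeros((m, r))           # constituent coefficients w^{(k)}
    z = np.full(m, 1.0 / m)        # combination weights z_1 = (1/m)[1,...,1]^T
    delta = np.full(m, 0.5)        # WMSE estimates delta^{(k)} (positive init)
    Lam = np.zeros(m)              # running sums of lambda, for recursive eq. (4)
    sq_errors = []
    for x, d in stream:
        y = W @ x                                # constituent outputs d_hat^{(k)}
        d_hat = float(z @ y)                     # final estimate
        sq_errors.append((d - d_hat) ** 2)
        l = 0.0                                  # l^{(1)} = 0
        for k in range(m):
            lam = min(1.0, delta[k] ** (c * l))  # lambda^{(k)}; lambda^{(1)} = 1
            e_k = d - y[k]
            W[k] += mu * lam * e_k * x           # weighted LMS step, eq. (5)
            f_clip = np.clip(y[k], -1.0, 1.0)    # thresholded filter output
            delta[k] = (Lam[k] * delta[k] + lam * 0.25 * (d - f_clip) ** 2) \
                       / (Lam[k] + lam)          # recursive form of eq. (4)
            Lam[k] += lam
            l += sigma2 - e_k ** 2               # total loss passed on, eq. (3)
        z += mu * (d - d_hat) * y                # combination LMS update, eq. (7)
    return W, z, sq_errors
```

On a toy linear stream the accumulated squared error decreases; the marked LMS step is where the data reuse or random updates variants would be swapped in.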
Hence, if E[λ_t^(k)] is upper bounded by λ̄^(k) < 1, the average computational complexity of the random updates method will be O(Σ_{k=1}^{m} λ̄^(k) r). In the Theorem below, we provide sufficient constraints for such an upper bound to hold. Furthermore, we can use such a bound for the data reuse mode as well: in this case, for each filter f^(k) we perform the LMS update ⌈K λ_t^(k)⌉ times, resulting in a computational complexity of order O(K Σ_{k=1}^{m} λ̄^(k) r). The following theorem determines the upper bound λ̄^(k) for E[λ_t^(k)].

Theorem: If the adaptive filters converge and achieve a sufficiently small MSE (in the sense made precise in the proof following this Theorem), then, provided that σ² is chosen properly, the following upper bound is obtained for λ̄^(k):

E[λ_t^(k)] ≤ λ̄^(k) = ( γ^{σ²} (1 + 2ζ ln γ) )^{k−1},  (8)

where γ ≜ E[δ_{t−1}^(k)] and ζ ≜ E[(e_t^(k))²]. It can be straightforwardly shown that this bound is less than 1 for appropriate choices of σ² and reasonable values of the MSE (according to the proof). This theorem states that if we adjust σ² such that it is achievable, i.e., the adaptive filters can provide a slightly lower MSE than σ², the probability of updating the filters in the random updates scenario decreases. This is of course our desired result, since if the filters are performing sufficiently well, there is no need for additional updates. Moreover, if σ² is chosen such that the filters cannot achieve an MSE equal to σ², the filters have to be updated at each iteration, which increases the complexity.

Outline of the proof: For simplicity, in this proof we assume that c = 1; however, the results are readily extended to general values of c. Assume that the e_t^(k)'s are independent and identically distributed (i.i.d.) zero-mean Gaussian random variables with variance ζ. It can be shown that we achieve the stated upper bound in the Theorem under the following necessary and sufficient conditions:

δ_{t−1}^(k) (1 + 2ζ ln((δ_{t−1}^(k))^{σ²})) < (δ_{t−1}^(k))^{(1+σ²)/(4(k−1))},  (9)

and

((1 − ξ_1)σ²) / (1 + 2σ² ln δ_{t−1}^(k)) < ζ < ((1 − ξ_2)σ²) / (1 + 2σ² ln δ_{t−1}^(k)),  (10)

where

ξ_1 = ( α(1 + σ²) − sqrt(α²(1 + σ²)² − 4α(k−1) δ_{t−1}^(k) σ²) ) / ( 2(k−1) δ_{t−1}^(k) σ² ),

ξ_2 = ( α(1 + σ²) + sqrt(α²(1 + σ²)² − 4α(k−1) δ_{t−1}^(k) σ²) ) / ( 2(k−1) δ_{t−1}^(k) σ² ),

and α ≜ 1 + 2ζ ln(δ_{t−1}^(k)).

V. EXPERIMENTS

In this section, we demonstrate the efficiency of the introduced methods in a nonstationary environment. These experiments show that our algorithms can successfully improve the performance of single piecewise linear filters and, in some cases, even outperform the conventional mixture method. We consider the case where the desired data is generated by a nonstationary piecewise linear model with 3 regions. x_t = [x_1 x_2]^T is drawn from a jointly Gaussian random process and then scaled such that x_t ∈ [0, 1]². We divide the total data interval [0, T] into 4 disjoint intervals, each of length T/4, and use a different 3-region model in each interval. In this experiment, each boosting algorithm uses 5 constituent filters, each of which uses a piecewise linear filter over a 2-region partition. The accumulated squared error (ASE) performances of the different methods are compared in Fig. 3. In Fig. 3, PLMS, MIX, and BPLMS respectively denote a single piecewise linear LMS filter, the ordinary mixture method, and the boosted filter methods. In addition, the suffixes WU, RU, and DR indicate the weighted updates, random updates, and data reuse methods, respectively. The learning rates for the LMS-based algorithms are set to 0.02, and the desired MSE parameter σ² is fixed in advance. Also, the direction vector for the separating hyperplane is set to θ = [θ_1 θ_2 θ_3]^T, where θ consists of three random variables, each with mean 1, to construct random constituent filters. The results show the superior performance of our algorithms over the single piecewise linear filters, as well as over the mixture method, in this highly nonstationary environment. Moreover, as shown in Fig. 3, the data reuse method performs better than the other boosting methods. However, according to Table I, the random updates method has a significantly lower time consumption, which makes it more desirable for big data applications.
VI. CONCLUSION

We introduce the boosting concept, extensively studied in the machine learning literature, to the adaptive filtering context, and propose three different boosting approaches, weighted updates, data reuse, and random updates, which are applicable to different adaptive filtering algorithms. We show that with these approaches we can improve the MSE performance of conventional LMS filters in piecewise linear models, and we provide an upper bound for the weights generated during the algorithm, which leads to a thorough analysis of the complexity of these methods. We show that the complexity of the random updates method is remarkably lower than that of the other two approaches, while the MSE performance does not degrade. Therefore, the boosting with random updates approach can be efficiently applied to real-life large scale problems.

Fig. 3: ASE performance comparison (accumulated squared error versus data length) of PLMS, MIX, BPLMS-WU, BPLMS-RU, and BPLMS-DR.

TABLE I: Time comparison of the different methods (seconds): LMS-MIX, BPLMS-RU, BPLMS-WU, BPLMS-DR.

REFERENCES

[1] R. E. Schapire and Y. Freund, Boosting: Foundations and Algorithms, MIT Press, 2012.
[2] Y. Freund and R. E. Schapire, "A decision-theoretic generalization of on-line learning and an application to boosting," Journal of Computer and System Sciences, vol. 55, pp. 119-139, 1997.
[3] D. L. Shrestha and D. P. Solomatine, "Experiments with AdaBoost.RT, an improved boosting scheme for regression," 2006.
[4] R. O. Duda, P. E. Hart, and D. G. Stork, Pattern Classification, John Wiley and Sons, 2001.
[5] S. S. Kozat, A. T. Erdogan, A. C. Singer, and A. H. Sayed, "Steady state MSE performance analysis of mixture approaches to adaptive filtering," IEEE Transactions on Signal Processing, 2010.
[6] S. Shaffer and C. S. Williams, "Comparison of LMS, alpha-LMS, and data reusing LMS algorithms," in Conference Record of the Seventeenth Asilomar Conference on Circuits, Systems and Computers, 1983.
[7] N. C. Oza and S. Russell, "Online bagging and boosting," in Proceedings of AISTATS, 2001.
[8] L. Bottou and O. Bousquet, "The tradeoffs of large scale learning," in NIPS, 2008.
[9] A. H.
Sayed, Fundamentals of Adaptive Filtering, John Wiley and Sons, 2003.
[10] S.-T. Chen, H.-T. Lin, and C.-J. Lu, "An online boosting algorithm with theoretical justifications," in ICML, 2012.
[11] N. D. Vanli and S. S. Kozat, "A comprehensive approach to universal piecewise nonlinear regression based on trees," IEEE Transactions on Signal Processing, vol. 62, no. 20, Oct. 2014.
[12] P. Malik, "Governing big data: Principles and practices," IBM J. Res. Dev., vol. 57, no. 3/4, May 2013.
[13] S. S. Kozat and A. C. Singer, "Universal switching linear least squares prediction," IEEE Transactions on Signal Processing, vol. 56, Jan. 2008.
[14] S. S. Kozat, A. C. Singer, and G. C. Zeitler, "Universal piecewise linear prediction via context trees," IEEE Transactions on Signal Processing, vol. 55, no. 7, 2007.
[15] R. A. Servedio, "Smooth boosting and learning with malicious noise," Journal of Machine Learning Research, vol. 4, 2003.
[16] A. Papoulis and S. U. Pillai, Probability, Random Variables, and Stochastic Processes, McGraw-Hill Higher Education, 4th edition.


More information

Approximate Analytic Solution of (2+1) - Dimensional Zakharov-Kuznetsov(Zk) Equations Using Homotopy

Approximate Analytic Solution of (2+1) - Dimensional Zakharov-Kuznetsov(Zk) Equations Using Homotopy Arcle Inernaonal Journal of Modern Mahemacal Scences, 4, (): - Inernaonal Journal of Modern Mahemacal Scences Journal homepage: www.modernscenfcpress.com/journals/jmms.aspx ISSN: 66-86X Florda, USA Approxmae

More information

Second-Order Non-Stationary Online Learning for Regression

Second-Order Non-Stationary Online Learning for Regression Second-Order Non-Saonary Onlne Learnng for Regresson Nna Vas, Edward Moroshko, and Koby Crammer, Fellow, IEEE arxv:303040v cslg] Mar 03 Absrac he goal of a learner, n sandard onlne learnng, s o have he

More information

Tight results for Next Fit and Worst Fit with resource augmentation

Tight results for Next Fit and Worst Fit with resource augmentation Tgh resuls for Nex F and Wors F wh resource augmenaon Joan Boyar Leah Epsen Asaf Levn Asrac I s well known ha he wo smple algorhms for he classc n packng prolem, NF and WF oh have an approxmaon rao of

More information

Lecture 11 SVM cont

Lecture 11 SVM cont Lecure SVM con. 0 008 Wha we have done so far We have esalshed ha we wan o fnd a lnear decson oundary whose margn s he larges We know how o measure he margn of a lnear decson oundary Tha s: he mnmum geomerc

More information

Robust and Accurate Cancer Classification with Gene Expression Profiling

Robust and Accurate Cancer Classification with Gene Expression Profiling Robus and Accurae Cancer Classfcaon wh Gene Expresson Proflng (Compuaonal ysems Bology, 2005) Auhor: Hafeng L, Keshu Zhang, ao Jang Oulne Background LDA (lnear dscrmnan analyss) and small sample sze problem

More information

Existence and Uniqueness Results for Random Impulsive Integro-Differential Equation

Existence and Uniqueness Results for Random Impulsive Integro-Differential Equation Global Journal of Pure and Appled Mahemacs. ISSN 973-768 Volume 4, Number 6 (8), pp. 89-87 Research Inda Publcaons hp://www.rpublcaon.com Exsence and Unqueness Resuls for Random Impulsve Inegro-Dfferenal

More information

Sampling Procedure of the Sum of two Binary Markov Process Realizations

Sampling Procedure of the Sum of two Binary Markov Process Realizations Samplng Procedure of he Sum of wo Bnary Markov Process Realzaons YURY GORITSKIY Dep. of Mahemacal Modelng of Moscow Power Insue (Techncal Unversy), Moscow, RUSSIA, E-mal: gorsky@yandex.ru VLADIMIR KAZAKOV

More information

Reactive Methods to Solve the Berth AllocationProblem with Stochastic Arrival and Handling Times

Reactive Methods to Solve the Berth AllocationProblem with Stochastic Arrival and Handling Times Reacve Mehods o Solve he Berh AllocaonProblem wh Sochasc Arrval and Handlng Tmes Nsh Umang* Mchel Berlare* * TRANSP-OR, Ecole Polyechnque Fédérale de Lausanne Frs Workshop on Large Scale Opmzaon November

More information

CS286.2 Lecture 14: Quantum de Finetti Theorems II

CS286.2 Lecture 14: Quantum de Finetti Theorems II CS286.2 Lecure 14: Quanum de Fne Theorems II Scrbe: Mara Okounkova 1 Saemen of he heorem Recall he las saemen of he quanum de Fne heorem from he prevous lecure. Theorem 1 Quanum de Fne). Le ρ Dens C 2

More information

Neural Networks-Based Time Series Prediction Using Long and Short Term Dependence in the Learning Process

Neural Networks-Based Time Series Prediction Using Long and Short Term Dependence in the Learning Process Neural Neworks-Based Tme Seres Predcon Usng Long and Shor Term Dependence n he Learnng Process J. Puchea, D. Paño and B. Kuchen, Absrac In hs work a feedforward neural neworksbased nonlnear auoregresson

More information

CS 268: Packet Scheduling

CS 268: Packet Scheduling Pace Schedulng Decde when and wha pace o send on oupu ln - Usually mplemened a oupu nerface CS 68: Pace Schedulng flow Ion Soca March 9, 004 Classfer flow flow n Buffer managemen Scheduler soca@cs.bereley.edu

More information

Econ107 Applied Econometrics Topic 5: Specification: Choosing Independent Variables (Studenmund, Chapter 6)

Econ107 Applied Econometrics Topic 5: Specification: Choosing Independent Variables (Studenmund, Chapter 6) Econ7 Appled Economercs Topc 5: Specfcaon: Choosng Independen Varables (Sudenmund, Chaper 6 Specfcaon errors ha we wll deal wh: wrong ndependen varable; wrong funconal form. Ths lecure deals wh wrong ndependen

More information

DEEP UNFOLDING FOR MULTICHANNEL SOURCE SEPARATION SUPPLEMENTARY MATERIAL

DEEP UNFOLDING FOR MULTICHANNEL SOURCE SEPARATION SUPPLEMENTARY MATERIAL DEEP UNFOLDING FOR MULTICHANNEL SOURCE SEPARATION SUPPLEMENTARY MATERIAL Sco Wsdom, John Hershey 2, Jonahan Le Roux 2, and Shnj Waanabe 2 Deparmen o Elecrcal Engneerng, Unversy o Washngon, Seale, WA, USA

More information

Graduate Macroeconomics 2 Problem set 5. - Solutions

Graduate Macroeconomics 2 Problem set 5. - Solutions Graduae Macroeconomcs 2 Problem se. - Soluons Queson 1 To answer hs queson we need he frms frs order condons and he equaon ha deermnes he number of frms n equlbrum. The frms frs order condons are: F K

More information

On computing differential transform of nonlinear non-autonomous functions and its applications

On computing differential transform of nonlinear non-autonomous functions and its applications On compung dfferenal ransform of nonlnear non-auonomous funcons and s applcaons Essam. R. El-Zahar, and Abdelhalm Ebad Deparmen of Mahemacs, Faculy of Scences and Humanes, Prnce Saam Bn Abdulazz Unversy,

More information

Forecasting Using First-Order Difference of Time Series and Bagging of Competitive Associative Nets

Forecasting Using First-Order Difference of Time Series and Bagging of Competitive Associative Nets Forecasng Usng Frs-Order Dfference of Tme Seres and Baggng of Compeve Assocave Nes Shuch Kurog, Ryohe Koyama, Shnya Tanaka, and Toshhsa Sanuk Absrac Ths arcle descrbes our mehod used for he 2007 Forecasng

More information

Algorithm Research on Moving Object Detection of Surveillance Video Sequence *

Algorithm Research on Moving Object Detection of Surveillance Video Sequence * Opcs and Phooncs Journal 03 3 308-3 do:0.436/opj.03.3b07 Publshed Onlne June 03 (hp://www.scrp.org/journal/opj) Algorhm Research on Movng Objec Deecon of Survellance Vdeo Sequence * Kuhe Yang Zhmng Ca

More information

CS 536: Machine Learning. Nonparametric Density Estimation Unsupervised Learning - Clustering

CS 536: Machine Learning. Nonparametric Density Estimation Unsupervised Learning - Clustering CS 536: Machne Learnng Nonparamerc Densy Esmaon Unsupervsed Learnng - Cluserng Fall 2005 Ahmed Elgammal Dep of Compuer Scence Rugers Unversy CS 536 Densy Esmaon - Cluserng - 1 Oulnes Densy esmaon Nonparamerc

More information

Hidden Markov Models Following a lecture by Andrew W. Moore Carnegie Mellon University

Hidden Markov Models Following a lecture by Andrew W. Moore Carnegie Mellon University Hdden Markov Models Followng a lecure by Andrew W. Moore Carnege Mellon Unversy www.cs.cmu.edu/~awm/uorals A Markov Sysem Has N saes, called s, s 2.. s N s 2 There are dscree meseps, 0,, s s 3 N 3 0 Hdden

More information

This document is downloaded from DR-NTU, Nanyang Technological University Library, Singapore.

This document is downloaded from DR-NTU, Nanyang Technological University Library, Singapore. Ths documen s downloaded from DR-NTU, Nanyang Technologcal Unversy Lbrary, Sngapore. Tle A smplfed verb machng algorhm for word paron n vsual speech processng( Acceped verson ) Auhor(s) Foo, Say We; Yong,

More information

Computing Relevance, Similarity: The Vector Space Model

Computing Relevance, Similarity: The Vector Space Model Compung Relevance, Smlary: The Vecor Space Model Based on Larson and Hears s sldes a UC-Bereley hp://.sms.bereley.edu/courses/s0/f00/ aabase Managemen Sysems, R. Ramarshnan ocumen Vecors v ocumens are

More information

Performance Analysis for a Network having Standby Redundant Unit with Waiting in Repair

Performance Analysis for a Network having Standby Redundant Unit with Waiting in Repair TECHNI Inernaonal Journal of Compung Scence Communcaon Technologes VOL.5 NO. July 22 (ISSN 974-3375 erformance nalyss for a Nework havng Sby edundan Un wh ang n epar Jendra Sngh 2 abns orwal 2 Deparmen

More information

Machine Learning 2nd Edition

Machine Learning 2nd Edition INTRODUCTION TO Lecure Sldes for Machne Learnng nd Edon ETHEM ALPAYDIN, modfed by Leonardo Bobadlla and some pars from hp://www.cs.au.ac.l/~aparzn/machnelearnng/ The MIT Press, 00 alpaydn@boun.edu.r hp://www.cmpe.boun.edu.r/~ehem/mle

More information

3. OVERVIEW OF NUMERICAL METHODS

3. OVERVIEW OF NUMERICAL METHODS 3 OVERVIEW OF NUMERICAL METHODS 3 Inroducory remarks Ths chaper summarzes hose numercal echnques whose knowledge s ndspensable for he undersandng of he dfferen dscree elemen mehods: he Newon-Raphson-mehod,

More information

Appendix to Online Clustering with Experts

Appendix to Online Clustering with Experts A Appendx o Onlne Cluserng wh Expers Furher dscusson of expermens. Here we furher dscuss expermenal resuls repored n he paper. Ineresngly, we observe ha OCE (and n parcular Learn- ) racks he bes exper

More information

F-Tests and Analysis of Variance (ANOVA) in the Simple Linear Regression Model. 1. Introduction

F-Tests and Analysis of Variance (ANOVA) in the Simple Linear Regression Model. 1. Introduction ECOOMICS 35* -- OTE 9 ECO 35* -- OTE 9 F-Tess and Analyss of Varance (AOVA n he Smple Lnear Regresson Model Inroducon The smple lnear regresson model s gven by he followng populaon regresson equaon, or

More information

HEAT CONDUCTION PROBLEM IN A TWO-LAYERED HOLLOW CYLINDER BY USING THE GREEN S FUNCTION METHOD

HEAT CONDUCTION PROBLEM IN A TWO-LAYERED HOLLOW CYLINDER BY USING THE GREEN S FUNCTION METHOD Journal of Appled Mahemacs and Compuaonal Mechancs 3, (), 45-5 HEAT CONDUCTION PROBLEM IN A TWO-LAYERED HOLLOW CYLINDER BY USING THE GREEN S FUNCTION METHOD Sansław Kukla, Urszula Sedlecka Insue of Mahemacs,

More information

A Novel Efficient Stopping Criterion for BICM-ID System

A Novel Efficient Stopping Criterion for BICM-ID System A Novel Effcen Soppng Creron for BICM-ID Sysem Xao Yng, L Janpng Communcaon Unversy of Chna Absrac Ths paper devses a novel effcen soppng creron for b-nerleaved coded modulaon wh erave decodng (BICM-ID)

More information

Single-loop System Reliability-Based Design & Topology Optimization (SRBDO/SRBTO): A Matrix-based System Reliability (MSR) Method

Single-loop System Reliability-Based Design & Topology Optimization (SRBDO/SRBTO): A Matrix-based System Reliability (MSR) Method 10 h US Naonal Congress on Compuaonal Mechancs Columbus, Oho 16-19, 2009 Sngle-loop Sysem Relably-Based Desgn & Topology Opmzaon (SRBDO/SRBTO): A Marx-based Sysem Relably (MSR) Mehod Tam Nguyen, Junho

More information

Linear Response Theory: The connection between QFT and experiments

Linear Response Theory: The connection between QFT and experiments Phys540.nb 39 3 Lnear Response Theory: The connecon beween QFT and expermens 3.1. Basc conceps and deas Q: ow do we measure he conducvy of a meal? A: we frs nroduce a weak elecrc feld E, and hen measure

More information

Efficient Asynchronous Channel Hopping Design for Cognitive Radio Networks

Efficient Asynchronous Channel Hopping Design for Cognitive Radio Networks Effcen Asynchronous Channel Hoppng Desgn for Cognve Rado Neworks Chh-Mn Chao, Chen-Yu Hsu, and Yun-ng Lng Absrac In a cognve rado nework (CRN), a necessary condon for nodes o communcae wh each oher s ha

More information

5th International Conference on Advanced Design and Manufacturing Engineering (ICADME 2015)

5th International Conference on Advanced Design and Manufacturing Engineering (ICADME 2015) 5h Inernaonal onference on Advanced Desgn and Manufacurng Engneerng (IADME 5 The Falure Rae Expermenal Sudy of Specal N Machne Tool hunshan He, a, *, La Pan,b and Bng Hu 3,c,,3 ollege of Mechancal and

More information

Comparison of Supervised & Unsupervised Learning in βs Estimation between Stocks and the S&P500

Comparison of Supervised & Unsupervised Learning in βs Estimation between Stocks and the S&P500 Comparson of Supervsed & Unsupervsed Learnng n βs Esmaon beween Socks and he S&P500 J. We, Y. Hassd, J. Edery, A. Becker, Sanford Unversy T I. INTRODUCTION HE goal of our proec s o analyze he relaonshps

More information

Time-interval analysis of β decay. V. Horvat and J. C. Hardy

Time-interval analysis of β decay. V. Horvat and J. C. Hardy Tme-nerval analyss of β decay V. Horva and J. C. Hardy Work on he even analyss of β decay [1] connued and resuled n he developmen of a novel mehod of bea-decay me-nerval analyss ha produces hghly accurae

More information

Bayes rule for a classification problem INF Discriminant functions for the normal density. Euclidean distance. Mahalanobis distance

Bayes rule for a classification problem INF Discriminant functions for the normal density. Euclidean distance. Mahalanobis distance INF 43 3.. Repeon Anne Solberg (anne@f.uo.no Bayes rule for a classfcaon problem Suppose we have J, =,...J classes. s he class label for a pxel, and x s he observed feaure vecor. We can use Bayes rule

More information

Discrete Markov Process. Introduction. Example: Balls and Urns. Stochastic Automaton. INTRODUCTION TO Machine Learning 3rd Edition

Discrete Markov Process. Introduction. Example: Balls and Urns. Stochastic Automaton. INTRODUCTION TO Machine Learning 3rd Edition EHEM ALPAYDI he MI Press, 04 Lecure Sldes for IRODUCIO O Machne Learnng 3rd Edon alpaydn@boun.edu.r hp://www.cmpe.boun.edu.r/~ehem/ml3e Sldes from exboo resource page. Slghly eded and wh addonal examples

More information

Testing a new idea to solve the P = NP problem with mathematical induction

Testing a new idea to solve the P = NP problem with mathematical induction Tesng a new dea o solve he P = NP problem wh mahemacal nducon Bacground P and NP are wo classes (ses) of languages n Compuer Scence An open problem s wheher P = NP Ths paper ess a new dea o compare he

More information

Li An-Ping. Beijing , P.R.China

Li An-Ping. Beijing , P.R.China A New Type of Cpher: DICING_csb L An-Png Bejng 100085, P.R.Chna apl0001@sna.com Absrac: In hs paper, we wll propose a new ype of cpher named DICING_csb, whch s derved from our prevous sream cpher DICING.

More information

Filtrage particulaire et suivi multi-pistes Carine Hue Jean-Pierre Le Cadre and Patrick Pérez

Filtrage particulaire et suivi multi-pistes Carine Hue Jean-Pierre Le Cadre and Patrick Pérez Chaînes de Markov cachées e flrage parculare 2-22 anver 2002 Flrage parculare e suv mul-pses Carne Hue Jean-Perre Le Cadre and Parck Pérez Conex Applcaons: Sgnal processng: arge rackng bearngs-onl rackng

More information

THE PREDICTION OF COMPETITIVE ENVIRONMENT IN BUSINESS

THE PREDICTION OF COMPETITIVE ENVIRONMENT IN BUSINESS THE PREICTION OF COMPETITIVE ENVIRONMENT IN BUSINESS INTROUCTION The wo dmensonal paral dfferenal equaons of second order can be used for he smulaon of compeve envronmen n busness The arcle presens he

More information

Ordinary Differential Equations in Neuroscience with Matlab examples. Aim 1- Gain understanding of how to set up and solve ODE s

Ordinary Differential Equations in Neuroscience with Matlab examples. Aim 1- Gain understanding of how to set up and solve ODE s Ordnary Dfferenal Equaons n Neuroscence wh Malab eamples. Am - Gan undersandng of how o se up and solve ODE s Am Undersand how o se up an solve a smple eample of he Hebb rule n D Our goal a end of class

More information

Lecture Slides for INTRODUCTION TO. Machine Learning. ETHEM ALPAYDIN The MIT Press,

Lecture Slides for INTRODUCTION TO. Machine Learning. ETHEM ALPAYDIN The MIT Press, Lecure Sldes for INTRDUCTIN T Machne Learnng ETHEM ALAYDIN The MIT ress, 2004 alpaydn@boun.edu.r hp://www.cmpe.boun.edu.r/~ehem/2ml CHATER 3: Hdden Marov Models Inroducon Modelng dependences n npu; no

More information

Lecture 2 M/G/1 queues. M/G/1-queue

Lecture 2 M/G/1 queues. M/G/1-queue Lecure M/G/ queues M/G/-queue Posson arrval process Arbrary servce me dsrbuon Sngle server To deermne he sae of he sysem a me, we mus now The number of cusomers n he sysems N() Tme ha he cusomer currenly

More information

Survival Analysis and Reliability. A Note on the Mean Residual Life Function of a Parallel System

Survival Analysis and Reliability. A Note on the Mean Residual Life Function of a Parallel System Communcaons n Sascs Theory and Mehods, 34: 475 484, 2005 Copyrgh Taylor & Francs, Inc. ISSN: 0361-0926 prn/1532-415x onlne DOI: 10.1081/STA-200047430 Survval Analyss and Relably A Noe on he Mean Resdual

More information

Joint Channel Estimation and Resource Allocation for MIMO Systems Part I: Single-User Analysis

Joint Channel Estimation and Resource Allocation for MIMO Systems Part I: Single-User Analysis 624 IEEE RANSACIONS ON WIRELESS COUNICAIONS, VOL. 9, NO. 2, FEBRUARY 200 Jon Channel Esmaon and Resource Allocaon for IO Sysems Par I: Sngle-User Analyss Alkan Soysal, ember, IEEE, and Sennur Ulukus, ember,

More information

EEL 6266 Power System Operation and Control. Chapter 5 Unit Commitment

EEL 6266 Power System Operation and Control. Chapter 5 Unit Commitment EEL 6266 Power Sysem Operaon and Conrol Chaper 5 Un Commmen Dynamc programmng chef advanage over enumeraon schemes s he reducon n he dmensonaly of he problem n a src prory order scheme, here are only N

More information

A decision-theoretic generalization of on-line learning. and an application to boosting. AT&T Labs. 180 Park Avenue. Florham Park, NJ 07932

A decision-theoretic generalization of on-line learning. and an application to boosting. AT&T Labs. 180 Park Avenue. Florham Park, NJ 07932 A decson-heorec generalzaon of on-lne learnng and an applcaon o boosng Yoav Freund Rober E. Schapre AT&T Labs 80 Park Avenue Florham Park, NJ 07932 fyoav, schapreg@research.a.com December 9, 996 Absrac

More information

Online Supplement for Dynamic Multi-Technology. Production-Inventory Problem with Emissions Trading

Online Supplement for Dynamic Multi-Technology. Production-Inventory Problem with Emissions Trading Onlne Supplemen for Dynamc Mul-Technology Producon-Invenory Problem wh Emssons Tradng by We Zhang Zhongsheng Hua Yu Xa and Baofeng Huo Proof of Lemma For any ( qr ) Θ s easy o verfy ha he lnear programmng

More information

Relative controllability of nonlinear systems with delays in control

Relative controllability of nonlinear systems with delays in control Relave conrollably o nonlnear sysems wh delays n conrol Jerzy Klamka Insue o Conrol Engneerng, Slesan Techncal Unversy, 44- Glwce, Poland. phone/ax : 48 32 37227, {jklamka}@a.polsl.glwce.pl Keywor: Conrollably.

More information

Lecture 2 L n i e n a e r a M od o e d l e s

Lecture 2 L n i e n a e r a M od o e d l e s Lecure Lnear Models Las lecure You have learned abou ha s machne learnng Supervsed learnng Unsupervsed learnng Renforcemen learnng You have seen an eample learnng problem and he general process ha one

More information

( ) () we define the interaction representation by the unitary transformation () = ()

( ) () we define the interaction representation by the unitary transformation () = () Hgher Order Perurbaon Theory Mchael Fowler 3/7/6 The neracon Represenaon Recall ha n he frs par of hs course sequence, we dscussed he chrödnger and Hesenberg represenaons of quanum mechancs here n he chrödnger

More information

General Weighted Majority, Online Learning as Online Optimization

General Weighted Majority, Online Learning as Online Optimization Sascal Technques n Robocs (16-831, F10) Lecure#10 (Thursday Sepember 23) General Weghed Majory, Onlne Learnng as Onlne Opmzaon Lecurer: Drew Bagnell Scrbe: Nahanel Barshay 1 1 Generalzed Weghed majory

More information

An Effective TCM-KNN Scheme for High-Speed Network Anomaly Detection

An Effective TCM-KNN Scheme for High-Speed Network Anomaly Detection Vol. 24, November,, 200 An Effecve TCM-KNN Scheme for Hgh-Speed Nework Anomaly eecon Yang L Chnese Academy of Scences, Bejng Chna, 00080 lyang@sofware.c.ac.cn Absrac. Nework anomaly deecon has been a ho

More information

A Novel Iron Loss Reduction Technique for Distribution Transformers. Based on a Combined Genetic Algorithm - Neural Network Approach

A Novel Iron Loss Reduction Technique for Distribution Transformers. Based on a Combined Genetic Algorithm - Neural Network Approach A Novel Iron Loss Reducon Technque for Dsrbuon Transformers Based on a Combned Genec Algorhm - Neural Nework Approach Palvos S. Georglaks Nkolaos D. Doulams Anasasos D. Doulams Nkos D. Hazargyrou and Sefanos

More information

A decision-theoretic generalization of on-line learning. and an application to boosting. AT&T Bell Laboratories. 600 Mountain Avenue

A decision-theoretic generalization of on-line learning. and an application to boosting. AT&T Bell Laboratories. 600 Mountain Avenue A decson-heorec generalzaon of on-lne learnng and an applcaon o boosng Yoav Freund Rober E. Schapre AT&T Bell Laboraores 600 Mounan Avenue Room f2b-428, 2A-424g Murray Hll, NJ 07974-0636 fyoav, schapreg@research.a.com

More information

Attribute Reduction Algorithm Based on Discernibility Matrix with Algebraic Method GAO Jing1,a, Ma Hui1, Han Zhidong2,b

Attribute Reduction Algorithm Based on Discernibility Matrix with Algebraic Method GAO Jing1,a, Ma Hui1, Han Zhidong2,b Inernaonal Indusral Informacs and Compuer Engneerng Conference (IIICEC 05) Arbue educon Algorhm Based on Dscernbly Marx wh Algebrac Mehod GAO Jng,a, Ma Hu, Han Zhdong,b Informaon School, Capal Unversy

More information

Bayesian Inference of the GARCH model with Rational Errors

Bayesian Inference of the GARCH model with Rational Errors 0 Inernaonal Conference on Economcs, Busness and Markeng Managemen IPEDR vol.9 (0) (0) IACSIT Press, Sngapore Bayesan Inference of he GARCH model wh Raonal Errors Tesuya Takash + and Tng Tng Chen Hroshma

More information

Fitting a Conditional Linear Gaussian Distribution

Fitting a Conditional Linear Gaussian Distribution Fng a Condonal Lnear Gaussan Dsrbuon Kevn P. Murphy 28 Ocober 1998 Revsed 29 January 2003 1 Inroducon We consder he problem of fndng he maxmum lkelhood ML esmaes of he parameers of a condonal Gaussan varable

More information

A Novel Object Detection Method Using Gaussian Mixture Codebook Model of RGB-D Information

A Novel Object Detection Method Using Gaussian Mixture Codebook Model of RGB-D Information A Novel Objec Deecon Mehod Usng Gaussan Mxure Codebook Model of RGB-D Informaon Lujang LIU 1, Gaopeng ZHAO *,1, Yumng BO 1 1 School of Auomaon, Nanjng Unversy of Scence and Technology, Nanjng, Jangsu 10094,

More information

The Analysis of the Thickness-predictive Model Based on the SVM Xiu-ming Zhao1,a,Yan Wang2,band Zhimin Bi3,c

The Analysis of the Thickness-predictive Model Based on the SVM Xiu-ming Zhao1,a,Yan Wang2,band Zhimin Bi3,c h Naonal Conference on Elecrcal, Elecroncs and Compuer Engneerng (NCEECE The Analyss of he Thcknesspredcve Model Based on he SVM Xumng Zhao,a,Yan Wang,band Zhmn B,c School of Conrol Scence and Engneerng,

More information

12d Model. Civil and Surveying Software. Drainage Analysis Module Detention/Retention Basins. Owen Thornton BE (Mech), 12d Model Programmer

12d Model. Civil and Surveying Software. Drainage Analysis Module Detention/Retention Basins. Owen Thornton BE (Mech), 12d Model Programmer d Model Cvl and Surveyng Soware Dranage Analyss Module Deenon/Reenon Basns Owen Thornon BE (Mech), d Model Programmer owen.hornon@d.com 4 January 007 Revsed: 04 Aprl 007 9 February 008 (8Cp) Ths documen

More information

Hidden Markov Models

Hidden Markov Models 11-755 Machne Learnng for Sgnal Processng Hdden Markov Models Class 15. 12 Oc 2010 1 Admnsrva HW2 due Tuesday Is everyone on he projecs page? Where are your projec proposals? 2 Recap: Wha s an HMM Probablsc

More information

2/20/2013. EE 101 Midterm 2 Review

2/20/2013. EE 101 Midterm 2 Review //3 EE Mderm eew //3 Volage-mplfer Model The npu ressance s he equalen ressance see when lookng no he npu ermnals of he amplfer. o s he oupu ressance. I causes he oupu olage o decrease as he load ressance

More information

Chapter 6: AC Circuits

Chapter 6: AC Circuits Chaper 6: AC Crcus Chaper 6: Oulne Phasors and he AC Seady Sae AC Crcus A sable, lnear crcu operang n he seady sae wh snusodal excaon (.e., snusodal seady sae. Complee response forced response naural response.

More information